Visual Analytics of Work Behavior Data - Insights on Individual Differences

Stress in working environments is a growing concern. We see potential in collecting sensor data to detect patterns in work behavior that may endanger well-being. In this paper, we describe how we applied visual analytics to a work behavior dataset containing information on facial expressions, postures, computer interactions, physiology, and subjective experience. The challenge is to interpret this multi-modal, low-level sensor data. In this work, we alternate between automatic analysis procedures and data visualization. Our aim is twofold: 1) to investigate how various sensor features relate to (stress-related) mental states, and 2) to develop suitable visualization methods that provide insight into a large amount of behavioral data. Our most important insight is that people differ considerably in their (stress-related) work behavior, which has to be taken into account in both the analyses and the visualizations.
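To make the analysis idea concrete, the following is a minimal sketch (not the authors' actual pipeline) of how sensor features could be related to self-reported stress separately per participant, which is one way to surface the individual differences the abstract emphasizes. The column names ("participant", "facial_au_activity", "keystrokes_per_min", "heart_rate", "reported_stress") are hypothetical stand-ins for the multi-modal features and subjective ratings described above.

```python
# Illustrative sketch only; assumes a flat table of per-interval features
# plus a subjective stress rating, with hypothetical column names.
import pandas as pd

def per_participant_correlations(df: pd.DataFrame,
                                 feature_cols: list[str],
                                 target_col: str = "reported_stress") -> pd.DataFrame:
    """Correlate each sensor feature with self-reported stress,
    computed separately per participant to expose individual differences."""
    rows = []
    for participant, group in df.groupby("participant"):
        # Pearson correlation of every feature with the stress rating
        corr = group[feature_cols + [target_col]].corr()[target_col]
        rows.append(corr.drop(target_col).rename(participant))
    return pd.DataFrame(rows)  # rows: participants, columns: features

# Hypothetical usage:
# df = pd.read_csv("work_behavior_features.csv")
# table = per_participant_correlations(
#     df, ["facial_au_activity", "keystrokes_per_min", "heart_rate"])
# print(table)  # differing signs/magnitudes per row indicate individual differences
```

A table like this could then feed the visualization side of the loop, for example as a heatmap in which each row is a participant, making it visible that the same feature can relate to stress differently across people.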
