Our Actions, Ourselves: How Unconscious Actions Become a Productivity Indicator

Productivity has always been, and remains, a central goal of organizations, be they economic, governmental, military, or educational. Having the means to monitor and detect the factors that affect productivity is therefore a major concern and the subject of ongoing research. Given that unconscious actions play an important role in the way we work, study, socialize, and even have fun, their significance in this context is clear. The goal of this work is to monitor unconscious actions, select those that are relevant to productivity, and proactively take measures to improve the underlying processes. Specifically, we use computer peripherals to monitor users' actions non-intrusively. Non-intrusiveness is essential: since we are interested in unconscious actions, the monitoring process itself must not perturb the behavior being observed. Peripherals such as the mouse, keyboard, and touch screen, and possibly webcams and microphones, can act as sensors that remain completely transparent to the user. Because we use them daily, they can collect data that, once processed, yields useful information about a particular user. From this we can build a behavioral profile that provides better insight into the user's actions, and detect or predict potentially negative states such as stress, fatigue, or loss of attention. When such states are detected or predicted, they can inform better decisions: we can suggest a coffee break to someone who is stressed; advise someone to work or study in the morning because the collected data indicate that this is the period of the day in which that person performs best; or suggest postponing a meeting because the person's current state makes conflict more likely. In short, we aim to use computer peripherals and smartphones to collect data non-intrusively, in order to detect or predict behavioral features (stress, fatigue, attention) and unconscious actions, and from these to build a behavioral profile of the user that makes it possible, for instance, to improve productivity. This is accomplished by monitoring mouse, keyboard, and touch screen usage non-intrusively and in real time. From the collected data, inferences are made about interaction patterns that make it possible to detect variations in behavioral features and unconscious actions. With this information we gain a detailed insight into those features, allowing us to proactively mitigate some of the problems that could arise. In this work a framework is proposed to integrate all of these features. We aim mainly to apply these concepts in learning contexts, in order to improve students' outcomes.
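Since the abstract does not specify the acquisition layer, the following is a minimal sketch of what such non-intrusive logging could look like, assuming the Python pynput library. The feature names (dwell time, flight time, mouse velocity) are standard in the keystroke- and mouse-dynamics literature and are illustrative, not necessarily the authors' exact feature set.

```python
# Hypothetical sketch of a non-intrusive logging layer: listeners run on
# background threads while the user works normally. Assumes 'pynput'.
import time
import math
from pynput import keyboard, mouse

key_down_times = {}  # key -> timestamp of last press
last_key_up = None   # timestamp of the last release (for flight time)
last_mouse = None    # (x, y, timestamp) of the last pointer sample
features = []        # stream of (feature_name, value) observations

def on_press(key):
    key_down_times[key] = time.time()

def on_release(key):
    global last_key_up
    now = time.time()
    down = key_down_times.pop(key, None)
    if down is not None:
        features.append(("dwell_time", now - down))          # key held duration
    if last_key_up is not None:
        features.append(("flight_time", now - last_key_up))  # gap between keys
    last_key_up = now

def on_move(x, y):
    global last_mouse
    now = time.time()
    if last_mouse is not None:
        px, py, pt = last_mouse
        dt = now - pt
        if dt > 0:
            velocity = math.hypot(x - px, y - py) / dt       # pixels/second
            features.append(("mouse_velocity", velocity))
    last_mouse = (x, y, now)

# Listeners are background threads; the user's workflow is untouched.
kb = keyboard.Listener(on_press=on_press, on_release=on_release)
ms = mouse.Listener(on_move=on_move)
kb.start()
ms.start()
```

The resulting feature stream could then be compared against a per-user baseline to detect the variations mentioned above. A hypothetical check of this kind, flagging a sustained rise in flight times (often associated with fatigue), might look like:

```python
# Hypothetical deviation check against a per-user baseline (illustrative
# threshold, not taken from the paper).
import statistics

def deviates(recent, baseline, threshold=2.0):
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline) or 1e-9  # guard against zero variance
    z = (statistics.mean(recent) - mu) / sigma
    return z > threshold  # True -> candidate for a proactive intervention
```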
