Towards Independent Stress Detection: A Dependent Model Using Facial Action Units

Our society is increasingly susceptible to chronic stress, driven by daily worries, workload, and the wish to fulfil a myriad of expectations. Unfortunately, prolonged exposure to stress leads to physical and mental health problems. To mitigate these consequences, mobile applications combined with wearables have been studied for stress tracking. However, wearables must be worn all day and can be costly. Given that most laptops have built-in cameras, using video data for personal tracking of stress levels could be a more affordable alternative. In previous work, video has been used to detect cognitive stress during driving by measuring the presence of anger or fear through a limited number of facial expressions. In contrast, we propose using 17 facial action units (AUs) that are not restricted to those emotions. We used five one-hour videos from the dataset collected by Lau [1], which show subjects while typing, resting, and exposed to a stressor: a multitasking exercise combined with social evaluation. We performed binary classification with several simple classifiers on AUs extracted from each video frame and achieved an accuracy of up to 74% in subject-independent classification and 91% in subject-dependent classification. These preliminary results indicate that the AUs most relevant for stress detection are not consistently the same across all five subjects. A strong person-specific component was likewise found in previous work on classification from facial cues.
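As a rough illustration of the evaluation protocol described above, the sketch below shows how per-frame AU features could be fed to a simple classifier under both a subject-independent (leave-one-subject-out) split and a subject-dependent (within-subject) split. This is not the authors' code: the file name au_frames.csv, the column layout (AU* intensity columns, a subject id, and a binary stressed label), and the choice of logistic regression in scikit-learn are all assumptions made for illustration.

# Minimal sketch, assuming per-frame AU intensities (e.g. exported by a tool
# such as OpenFace) stored in a hypothetical table au_frames.csv with one row
# per frame, 17 AU columns, a "subject" id, and a binary "stressed" label.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

df = pd.read_csv("au_frames.csv")                      # hypothetical AU table
au_cols = [c for c in df.columns if c.startswith("AU")]
X = df[au_cols].values
y = df["stressed"].values
groups = df["subject"].values

# Subject-independent: train on four subjects, test on the held-out fifth.
logo = LeaveOneGroupOut()
indep_acc = cross_val_score(LogisticRegression(max_iter=1000),
                            X, y, cv=logo, groups=groups).mean()

# Subject-dependent: a separate model per subject, evaluated within-subject.
dep_accs = []
for subj in np.unique(groups):
    mask = groups == subj
    dep_accs.append(cross_val_score(LogisticRegression(max_iter=1000),
                                    X[mask], y[mask], cv=5).mean())

print(f"subject-independent accuracy: {indep_acc:.2f}")
print(f"mean subject-dependent accuracy: {np.mean(dep_accs):.2f}")

Any simple classifier could stand in for logistic regression here; the key design choice is that the subject-independent split keeps all frames of a given subject in either the training or the test set, which is what makes the 74% vs. 91% comparison in the abstract meaningful.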

[1] Miguel A. Labrador, et al. mStress: A mobile recommender system for just-in-time interventions for stress, 2017, 14th IEEE Annual Consumer Communications & Networking Conference (CCNC).

[2] Javier Hernandez, et al. Call Center Stress Recognition with Person-Specific Models, 2011, ACII.

[3] Mary Czerwinski, et al. Under pressure: sensing stress of computer users, 2014, CHI.

[4] Jennifer Healey, et al. Detecting stress during real-world driving tasks using physiological sensors, 2005, IEEE Transactions on Intelligent Transportation Systems.

[5] Martin Gjoreski, et al. Real-time physical activity and mental stress management with a wristband and a smartphone, 2017, UbiComp/ISWC Adjunct.

[6] Tom Cox, et al. The Cost of Work-Related Stress to Society: A Systematic Review, 2018, Journal of Occupational Health Psychology.

[7] F. Smit, et al. A health economic outcome evaluation of an internet-based mobile-supported stress management intervention for employees, 2017, Scandinavian Journal of Work, Environment & Health.

[8] Jean-Philippe Thiran, et al. Detecting emotional stress from facial expressions for driving safety, 2014, IEEE International Conference on Image Processing (ICIP).

[9] Mykola Pechenizkiy, et al. Stress detection from speech and Galvanic Skin Response signals, 2013, Proceedings of the 26th IEEE International Symposium on Computer-Based Medical Systems.

[10] Oscar Mayora-Ibarra, et al. Automatic Stress Detection in Working Environments From Smartphones' Accelerometer Data: A First Step, 2015, IEEE Journal of Biomedical and Health Informatics.

[11] Georgy L. Gimel'farb, et al. Unsupervised Stress Detection Algorithm and Experiments with Real Life Data, 2017, EPIA.

[12] Takeo Kanade, et al. The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression, 2010, IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Workshops.

[13] Ahmad R. Hariri, et al. Facial Expressions of Emotion Reveal Neuroendocrine and Cardiovascular Stress Responses, 2007, Biological Psychiatry.

[14] Fernando De la Torre, et al. Dynamic Cascades with Bidirectional Bootstrapping for Action Unit Detection in Spontaneous Facial Behavior, 2011, IEEE Transactions on Affective Computing.

[15] Jean-Philippe Thiran, et al. Action Units and Their Cross-Correlations for Prediction of Cognitive Load during Driving, 2017, IEEE Transactions on Affective Computing.

[16] Begoña García Zapirain, et al. A Stress Sensor Based on Galvanic Skin Response (GSR) Controlled by ZigBee, 2012, Sensors.

[17] Peter Robinson, et al. OpenFace: An open source facial behavior analysis toolkit, 2016, IEEE Winter Conference on Applications of Computer Vision (WACV).

[18] Manolis Tsiknakis, et al. Stress Detection from Speech Using Spectral Slope Measurements, 2016, MindCare/Fabulous.

[19] Martin L. Griss, et al. Activity-Aware Mental Stress Detection Using Physiological Sensors, 2010, MobiCASE.

[20] Matevz Pogacnik, et al. Noninvasive stress recognition considering the current activity, 2015, Personal and Ubiquitous Computing.

[21] Geoffrey E. Hinton, et al. Visualizing Data using t-SNE, 2008, Journal of Machine Learning Research.

[22] Juan Carlos Augusto, et al. New Methods for Stress Assessment and Monitoring at the Workplace, 2019, IEEE Transactions on Affective Computing.