Emotion Recognition Using Smart Watch Sensor Data: Mixed-Design Study

Background: Research in psychology has shown that the way a person walks reflects that person's current mood (or emotional state). Recent studies have used mobile phones to detect emotional states from movement data.

Objective: The objective of our study was to investigate the use of movement sensor data from a smart watch to infer an individual's emotional state. We present findings from a user study with 50 participants.

Methods: The experiment used a mixed design: within-subjects (emotions: happy, sad, and neutral) and between-subjects (stimulus type: audiovisual "movie clips" and audio "music clips"). Each participant experienced both emotions (happy and sad) in a single stimulus type. All participants walked 250 m while wearing a smart watch on one wrist and a heart rate monitor strap on the chest. They also completed a short questionnaire (20 items; Positive Affect and Negative Affect Schedule [PANAS]) before and after experiencing each emotion. The heart rate monitor data served as supplementary information. We performed time series analysis on the smart watch data and t tests on the questionnaire items to measure the change in emotional state; heart rate data were analyzed using one-way analysis of variance. We extracted features from the time series using sliding windows and used these features to train and validate classifiers that inferred an individual's emotion.

Results: Overall, 50 young adults participated in our study; of them, 49 were included in the analysis of the PANAS questionnaire and 44 in the feature extraction and building of personal models. Participants reported feeling less negative affect after watching sad videos or after listening to sad music (P<.006). For the task of emotion recognition using classifiers, personal models outperformed personal baselines and achieved median accuracies higher than 78% across all conditions of the study design for the binary classification of happy versus sad.

Conclusions: Our findings show that we can detect changes in emotional state, as well as in behavioral responses, from smart watch sensor data. Together with the high accuracies achieved across all users for the classification of happy versus sad emotional states, this is further evidence for the hypothesis that movement sensor data can be used for emotion recognition.
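The sliding-window feature extraction described in the Methods can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical example, not the authors' actual pipeline: it segments a tri-axial accelerometer stream into overlapping windows and computes simple per-axis statistics for each window. The window length, step size, and feature set are all illustrative assumptions.

```python
import numpy as np

def sliding_window_features(signal, window_size=128, step=64):
    """Segment a (n_samples, 3) accelerometer stream into overlapping
    windows and compute simple per-axis statistical features.

    window_size and step are illustrative choices, not the paper's.
    """
    features = []
    for start in range(0, len(signal) - window_size + 1, step):
        window = signal[start:start + window_size]
        # Per-axis mean, std, min, max, and root mean square:
        # a common baseline feature set for movement data.
        stats = np.concatenate([
            window.mean(axis=0),
            window.std(axis=0),
            window.min(axis=0),
            window.max(axis=0),
            np.sqrt((window ** 2).mean(axis=0)),
        ])
        features.append(stats)
    return np.array(features)

# Example: 10 seconds of simulated 50 Hz tri-axial accelerometer data.
rng = np.random.default_rng(0)
acc = rng.normal(size=(500, 3))
X = sliding_window_features(acc)
print(X.shape)  # (6, 15): 6 windows, 5 statistics x 3 axes
```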
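The classification step can be sketched similarly. The snippet below is a hedged illustration assuming scikit-learn: it evaluates a per-participant ("personal") model on the binary happy-versus-sad task and compares it against a majority-class personal baseline under cross-validation. The random forest classifier, the baseline strategy, and the 10-fold scheme are assumptions for illustration, not necessarily the study's exact choices.

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def personal_model_vs_baseline(X, y, n_splits=10):
    """Compare a per-participant classifier against a majority-class
    baseline for binary happy-vs-sad classification.

    X: (n_windows, n_features) sliding-window features for one participant.
    y: (n_windows,) labels, e.g., 0 = sad, 1 = happy.
    """
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    baseline = DummyClassifier(strategy="most_frequent")
    model_acc = cross_val_score(model, X, y, cv=n_splits).mean()
    baseline_acc = cross_val_score(baseline, X, y, cv=n_splits).mean()
    return model_acc, baseline_acc

# Example with simulated features; in a study like this, one such
# comparison would be made per participant and the per-participant
# accuracies summarized by their median.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 15))
y = rng.integers(0, 2, size=60)
print(personal_model_vs_baseline(X, y))
```

Evaluating each participant's model against that same participant's baseline is what makes the "personal models outperformed personal baselines" comparison meaningful, since class imbalance alone could otherwise inflate accuracy.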
