Detecting naturalistic expression of emotions using physiological signals while playing video games

Affective gaming has recently become an active research field, driven by the importance of players' emotions during computer game play. Emotions can be detected from various modalities such as facial expressions, voice, and physiological signals. In this study, we evaluate an XGBoost ensemble and a deep neural network for detecting naturalistic expressions of emotion in video game players using physiological signals. Physiological data were collected from twelve participants while they played the PUBG mobile game. Both discrete and dimensional emotion models were evaluated. We assessed the performance of the classification models using individual physiological channels as well as a fusion of these channels, and we compared user-dependent and user-independent models. Our results indicate that the dimensional valence-arousal model yields higher accuracy than the discrete emotion model. The results also show that ECG features and a fusion of features from all physiological channels provide the highest affect detection accuracy. Our user-dependent deep neural network achieved the highest accuracy, detecting valence and arousal at 77.92% and 78.58% respectively using the fused features. The user-independent models were not feasible, presumably due to strong individual differences in physiological responses.
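
For concreteness, the sketch below illustrates the two classifier families named above, an XGBoost ensemble and a small fully connected deep neural network with dropout, trained on a fused vector of per-window physiological features for binary high/low valence (or arousal) classification. This is a minimal illustrative sketch only: the feature dimensionality, labels, network size, and hyperparameters are assumptions for demonstration, not the configuration reported in the paper.

```python
# Minimal sketch (not the authors' code): XGBoost ensemble and a small DNN with
# dropout, trained on fused per-window physiological features (e.g. ECG, EDA).
# Data, dimensions, and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 40)).astype(np.float32)  # fused feature vectors per window (assumed shape)
y = rng.integers(0, 2, size=1000)                    # binary high/low valence (or arousal) labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# --- XGBoost ensemble ---
xgb = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
xgb.fit(X_tr, y_tr)
print("XGBoost accuracy:", accuracy_score(y_te, xgb.predict(X_te)))

# --- Fully connected deep neural network with dropout regularization ---
model = nn.Sequential(
    nn.Linear(X.shape[1], 128), nn.ReLU(), nn.Dropout(0.3),
    nn.Linear(128, 64), nn.ReLU(), nn.Dropout(0.3),
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
Xt, yt = torch.from_numpy(X_tr), torch.from_numpy(y_tr).long()
for epoch in range(50):  # small fixed training budget for the sketch
    opt.zero_grad()
    loss = loss_fn(model(Xt), yt)
    loss.backward()
    opt.step()
model.eval()
with torch.no_grad():
    preds = model(torch.from_numpy(X_te)).argmax(dim=1).numpy()
print("DNN accuracy:", accuracy_score(y_te, preds))
```

In a user-dependent setup, such models would be trained and evaluated on windows from the same participant; a user-independent setup would instead hold out entire participants, which the abstract reports was not feasible here due to individual differences in physiological responses.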
