Emotion recognition based on convolutional neural networks and heterogeneous bio-signal data sources

Abstract Emotion recognition is a crucial application in human–computer interaction. It is usually conducted using facial expressions as the main modality, which alone may not be reliable. In this study, we proposed a multimodal approach that uses two-channel electroencephalography (EEG) signals and an eye modality in addition to the face modality to enhance recognition performance. We also compared facial images with facial depth maps as the face modality, and adopted the common arousal–valence model of emotions together with a convolutional neural network that models the spatiotemporal information in the modality data. Extensive experiments on the modality and emotion data showed that our system achieves accuracies of 67.8% in valence recognition and 77.0% in arousal recognition. The proposed method outperformed most state-of-the-art systems that use similar but fewer modalities, and the facial-depth modality outperformed facial images. The proposed method of emotion recognition has significant potential for integration into various educational applications.
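
To make the fusion idea concrete, the following is a minimal, hypothetical PyTorch sketch of a multimodal CNN with per-modality branches (facial depth, eye, two-channel EEG) fused at the feature level and two heads for valence and arousal. The branch depths, layer sizes, input shapes, and fusion strategy are illustrative assumptions, not the architecture reported in the paper.

```python
# Hypothetical sketch: feature-level fusion of face, eye, and EEG branches.
# All hyperparameters and input shapes are assumptions for illustration.
import torch
import torch.nn as nn


class ModalityBranch(nn.Module):
    """2D CNN branch mapping one modality (facial depth map, eye-region image,
    or a 2-channel EEG time-frequency map) to a fixed-length feature vector."""

    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> (B, 64, 1, 1)
        )
        self.proj = nn.Linear(64, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(self.features(x).flatten(1))


class MultimodalEmotionNet(nn.Module):
    """Concatenates branch features and predicts high/low valence and arousal."""

    def __init__(self, feat_dim: int = 128):
        super().__init__()
        self.face = ModalityBranch(in_channels=1, feat_dim=feat_dim)  # depth map
        self.eye = ModalityBranch(in_channels=1, feat_dim=feat_dim)
        self.eeg = ModalityBranch(in_channels=2, feat_dim=feat_dim)   # 2-channel EEG
        self.valence_head = nn.Linear(3 * feat_dim, 2)
        self.arousal_head = nn.Linear(3 * feat_dim, 2)

    def forward(self, face, eye, eeg):
        fused = torch.cat([self.face(face), self.eye(eye), self.eeg(eeg)], dim=1)
        return self.valence_head(fused), self.arousal_head(fused)


if __name__ == "__main__":
    net = MultimodalEmotionNet()
    face = torch.randn(4, 1, 64, 64)  # facial depth frames (assumed size)
    eye = torch.randn(4, 1, 32, 32)   # eye-region crops (assumed size)
    eeg = torch.randn(4, 2, 32, 32)   # EEG time-frequency maps (assumed size)
    valence_logits, arousal_logits = net(face, eye, eeg)
    print(valence_logits.shape, arousal_logits.shape)  # torch.Size([4, 2]) each
```

Feature-level (late) fusion is only one possible design choice here; decision-level fusion of per-modality classifiers would be an equally plausible reading of the abstract.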
