Classification of Five Emotions from EEG and Eye Movement Signals: Complementary Representation Properties

Recently, various multimodal approaches have been developed to enhance the performance of affective models. In this paper, we investigate the complementary representation properties of EEG and eye movement signals for classifying five human emotions: happy, sad, fear, disgust, and neutral. We compare the performance of single-modality models with that of two different modality-fusion approaches. The results indicate that EEG is superior to eye movements in classifying the happy, sad, and disgust emotions, whereas eye movements outperform EEG in recognizing the fear and neutral emotions. Overall, EEG has an advantage over eye movements in classifying the five emotions, with mean accuracies of 69.50% and 59.81%, respectively. Owing to these complementary representation properties, modality fusion with a bimodal deep auto-encoder significantly improves the classification accuracy, to 79.71%. Furthermore, we study the neural patterns of the five emotion states and the recognition performance of different eye movement features. The results reveal that the five emotions have distinguishable neural patterns and that pupil diameter has higher discrimination ability than the other eye movement features.
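
The fusion architecture is identified only as a bimodal deep auto-encoder, so the following is a minimal sketch of that idea rather than the paper's exact model: modality-specific encoders for the EEG and eye movement features, a shared layer producing a fused code, and per-modality decoders trained with a reconstruction loss. PyTorch, the layer sizes, and the feature dimensions (310 EEG differential-entropy features and 33 eye movement features, typical of SEED-style setups) are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class BDAE(nn.Module):
    """Bimodal deep auto-encoder sketch: two private encoders, one shared code."""
    def __init__(self, eeg_dim=310, eye_dim=33, hidden=64):  # dims are assumptions
        super().__init__()
        # Modality-specific encoders map each feature vector to a private representation.
        self.eeg_enc = nn.Sequential(nn.Linear(eeg_dim, 128), nn.ReLU())
        self.eye_enc = nn.Sequential(nn.Linear(eye_dim, 128), nn.ReLU())
        # The shared layer fuses both private representations into a single code.
        self.fuse = nn.Sequential(nn.Linear(256, hidden), nn.ReLU())
        # Per-modality decoders reconstruct each input from the shared code.
        self.eeg_dec = nn.Linear(hidden, eeg_dim)
        self.eye_dec = nn.Linear(hidden, eye_dim)

    def forward(self, eeg, eye):
        code = self.fuse(torch.cat([self.eeg_enc(eeg), self.eye_enc(eye)], dim=1))
        return code, self.eeg_dec(code), self.eye_dec(code)

model = BDAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
eeg, eye = torch.randn(32, 310), torch.randn(32, 33)  # placeholder feature batches
for _ in range(100):
    _, eeg_hat, eye_hat = model(eeg, eye)
    # Reconstructing both modalities forces the shared code to retain
    # information from EEG and eye movements jointly.
    loss = nn.functional.mse_loss(eeg_hat, eeg) + nn.functional.mse_loss(eye_hat, eye)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this line of work the fused code is typically handed to a separate classifier (e.g., an SVM) after unsupervised pretraining; the random tensors above merely stand in for real feature batches.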
