EEG-based emotion recognition using 4D convolutional recurrent neural network

In this paper, we present a novel method, the four-dimensional convolutional recurrent neural network, which explicitly integrates the frequency, spatial, and temporal information of multichannel EEG signals to improve the accuracy of EEG-based emotion recognition. First, to preserve these three kinds of information, we transform the differential entropy features extracted from the different channels into a 4D structure used to train the deep model. Then, we introduce the CRNN model, which combines a convolutional neural network (CNN) with a recurrent neural network built on long short-term memory (LSTM) cells. The CNN learns frequency and spatial information from each temporal slice of the 4D input, and the LSTM extracts temporal dependencies from the CNN outputs. The output of the last LSTM node is used for classification. Our model achieves state-of-the-art performance on both the SEED and DEAP datasets under intra-subject splitting. The experimental results demonstrate the effectiveness of integrating the frequency, spatial, and temporal information of EEG signals for emotion recognition.
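
To make the described data flow concrete, below is a minimal PyTorch sketch of a CNN-plus-LSTM pipeline over a 4D input shaped (batch, T, F, H, W): T temporal slices, F frequency bands of differential entropy features (e.g., 4 bands), and an H x W electrode grid (e.g., 9 x 9). The layer widths, the class name CRNN4D, and all hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Hedged sketch: a shared CNN encodes each temporal slice, an LSTM models
# the sequence of slice embeddings, and the last LSTM output is classified.
import torch
import torch.nn as nn

class CRNN4D(nn.Module):
    def __init__(self, n_bands=4, hidden=128, n_classes=3):  # e.g., 3 classes as in SEED
        super().__init__()
        # CNN shared across temporal slices: learns frequency/spatial features
        self.cnn = nn.Sequential(
            nn.Conv2d(n_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),            # -> (64, 4, 4) for any grid size
            nn.Flatten(),                       # -> 1024
            nn.Linear(64 * 4 * 4, 256), nn.ReLU(),
        )
        # LSTM captures temporal dependencies across the CNN outputs
        self.lstm = nn.LSTM(input_size=256, hidden_size=hidden, batch_first=True)
        # Classifier applied to the output of the last time step
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                       # x: (batch, T, F, H, W)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1))       # (batch*T, 256)
        feats = feats.view(b, t, -1)            # (batch, T, 256)
        out, _ = self.lstm(feats)               # (batch, T, hidden)
        return self.fc(out[:, -1])              # classify from the last LSTM node

# Usage with a dummy batch: 8 samples, 6 temporal slices, 4 bands, 9x9 electrode map
logits = CRNN4D()(torch.randn(8, 6, 4, 9, 9))
print(logits.shape)                             # torch.Size([8, 3])
```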
