Affective State Recognition through EEG Signals: Feature-Level Fusion and Ensemble Classifier

Human affect is a complex phenomenon and an active research domain in affective computing. Affects are traditionally determined through self-report psychometric questionnaires or through facial expression recognition. However, several state-of-the-art studies have shown that human affects can also be recognized from psychophysiological and neurological signals. In this article, electroencephalogram (EEG) signals are used to recognize human affects. EEG signals of 100 participants are collected while the participants watch one-minute video stimuli designed to induce different affective states. The emotionally tagged videos cover a range of affects, including happiness, sadness, disgust, and peacefulness. The experimental data are collected and analyzed intensively. The interrelationship between the EEG signal frequencies and the ratings given by the participants is taken into account when classifying affective states. Advanced feature extraction techniques are applied alongside statistical features to prepare a fused feature vector for affective state recognition. Factor analysis methods are then applied to select discriminative features. Finally, several popular supervised machine learning classifiers are applied to recognize the different affective states from the discriminative feature vector. In the experiments, the designed random forest classifier achieves 89.06% accuracy in classifying the four basic affective states.
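The pipeline described above (feature-level fusion, factor-analysis-based feature selection, and a random forest ensemble classifier) can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the authors' actual implementation: the feature dimensions, the number of factors, and the forest size are all assumptions, since the paper's dataset and exact configuration are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-ins for the two feature families named in the abstract: statistical
# features and advanced (e.g., frequency-domain) features. Shapes are
# illustrative only, not taken from the paper.
n_trials = 400
stat_feats = rng.normal(size=(n_trials, 32))
band_feats = rng.normal(size=(n_trials, 20))
labels = rng.integers(0, 4, size=n_trials)  # happy, sad, disgust, peaceful

# Feature-level fusion: concatenate the feature families into one vector.
fused = np.hstack([stat_feats, band_feats])

# Factor analysis to obtain a lower-dimensional, discriminative representation.
fa = FactorAnalysis(n_components=10, random_state=0)
reduced = fa.fit_transform(fused)

X_train, X_test, y_train, y_test = train_test_split(
    reduced, labels, test_size=0.25, random_state=0)

# Random forest ensemble classifier over the reduced feature vector.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

With real EEG features in place of the synthetic arrays, the same three stages (fusion, factor analysis, ensemble classification) mirror the workflow the abstract describes; on random labels, the printed accuracy naturally hovers near chance (about 0.25 for four classes).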