EEG-based recognition of video-induced emotions: Selecting subject-independent feature set

Emotions are fundamental to everyday life, affecting our communication, learning, perception, and decision making. Incorporating emotions into human-computer interaction (HCI) can be seen as a significant step forward, offering great potential for developing advanced future technologies. As the electrical activity of the brain is affected by emotions, the electroencephalogram (EEG) offers an interesting channel for improving HCI. In this paper, the selection of a subject-independent feature set for EEG-based emotion recognition is studied. We investigate the effect of different feature sets on classifying a person's arousal and valence while watching videos with emotional content. The classification performance is optimized by applying a sequential forward floating search (SFFS) algorithm for feature selection. The best classification rate (65.1% for arousal and 63.0% for valence) is obtained with a feature set containing power spectral features from the frequency band of 1-32 Hz. The proposed approach substantially improves the classification rates reported in the literature. In the future, further analysis of video-induced EEG changes, including topographical differences in the spectral features, is needed.
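To make the feature-selection step concrete, below is a minimal sketch of sequential forward floating search (SFFS): greedy forward inclusion of the most useful feature, followed by a conditional backward exclusion whenever dropping a feature improves on the best score previously seen at that subset size. The synthetic data, the LDA classifier, and all names here are illustrative assumptions, not the paper's actual EEG features or pipeline.

```python
# Illustrative SFFS sketch (not the paper's exact implementation).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def cv_score(X, y, features):
    """Mean cross-validated accuracy using only the given feature columns."""
    clf = LinearDiscriminantAnalysis()
    return cross_val_score(clf, X[:, list(features)], y, cv=5).mean()

def sffs(X, y, k):
    """Select up to k features by sequential forward floating search."""
    selected, scores = [], {}
    while len(selected) < k:
        # Forward step: add the single best remaining feature.
        remaining = [f for f in range(X.shape[1]) if f not in selected]
        best = max(remaining, key=lambda f: cv_score(X, y, selected + [f]))
        selected.append(best)
        scores[len(selected)] = cv_score(X, y, selected)
        # Floating (backward) step: drop a feature only if the reduced set
        # strictly beats the best score recorded at that smaller size.
        while len(selected) > 2:
            worst = max(selected,
                        key=lambda f: cv_score(X, y, [g for g in selected if g != f]))
            reduced = [g for g in selected if g != worst]
            if cv_score(X, y, reduced) > scores.get(len(reduced), -np.inf):
                selected = reduced
                scores[len(selected)] = cv_score(X, y, selected)
            else:
                break
    return selected, scores[len(selected)]

# Synthetic demo: features 0 and 1 carry the class signal, the rest are noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)
X = rng.normal(size=(200, 10))
X[:, 0] += 1.5 * y
X[:, 1] -= 1.5 * y

features, score = sffs(X, y, k=3)
print(sorted(features), round(score, 2))
```

In an EEG setting, each column of `X` would be one candidate feature (e.g. band power of a given channel and frequency band) computed per trial, and the floating backward step is what distinguishes SFFS from plain forward selection: it can undo an early inclusion that later becomes redundant.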
