Comparative analysis of physiological signals and electroencephalogram (EEG) for multimodal emotion recognition using generative models

Multimodal emotion recognition (MER) is an application of machine learning in which different biological signals are used to automatically classify an affective state. MER systems have been developed for applications ranging from psychological evaluation and anxiety assessment to human-machine interfaces and marketing. Several classification spaces have been proposed in the state of the art for the emotion recognition task; the best known are discrete and dimensional spaces, where emotions are described in terms of a set of basic emotions or of latent dimensions, respectively. Dimensional classification spaces allow a wider range of emotional states to be analyzed. The most common dimensional space used for this purpose is the arousal/valence space, where emotions are described by their intensity, ranging from inactive to active in the arousal dimension and from unpleasant to pleasant in the valence dimension. Physiological signals and the EEG are well suited for emotion recognition because an emotional state generates responses in different biological systems of the human body. Since the expression of an emotion is a dynamic process, we propose the use of generative models such as Hidden Markov Models (HMMs) to capture the dynamics of the signals for subsequent classification of emotional states in terms of arousal and valence. For the development of this work, an international database for emotion classification known as the Dataset for Emotion Analysis using Physiological signals (DEAP) is used.
The objective of this work is to determine which of the physiological and EEG signals carry the most relevant information for the emotion recognition task. Several experiments using HMMs trained on individual signals, and on combinations of them, are performed, and the results show that some of these signals, namely the EEG, the galvanic skin response (GSR), and the heart rate (HR), provide better discrimination between arousal and valence levels.
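The classification scheme described above can be sketched as follows: one HMM is trained per class (e.g., low vs. high arousal), and a new signal sequence is assigned to the class whose model gives it the highest likelihood, computed with the forward algorithm. The example below is a minimal illustration of that idea, not the paper's implementation; the two-state discrete HMMs, their parameters, and the quantized GSR observations are invented for demonstration.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm in log space for stability.
    pi: initial state probabilities, A: transition matrix,
    B[i, k]: probability of emitting symbol k from state i."""
    log_alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        # Recursion: alpha_t(j) = sum_i alpha_{t-1}(i) * A[i, j] * B[j, o_t],
        # done as a log-sum-exp to avoid underflow on long sequences.
        m = log_alpha.max()
        log_alpha = np.log(np.exp(log_alpha - m) @ A) + m + np.log(B[:, o])
    m = log_alpha.max()
    return m + np.log(np.exp(log_alpha - m).sum())

# Toy 2-state HMMs over a quantized GSR level (0 = low, 1 = high).
# All parameters here are hypothetical, chosen only to make the example work.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.3, 0.7]])
B_low = np.array([[0.9, 0.1],    # "low arousal" model: mostly low GSR
                  [0.6, 0.4]])
B_high = np.array([[0.2, 0.8],   # "high arousal" model: mostly high GSR
                   [0.1, 0.9]])

seq = np.array([1, 1, 0, 1, 1])  # an observed, mostly-high GSR sequence
scores = {"low": forward_log_likelihood(seq, pi, A, B_low),
          "high": forward_log_likelihood(seq, pi, A, B_high)}
print(max(scores, key=scores.get))  # prints "high": maximum-likelihood class
```

In practice each class model would be trained on labeled sequences (e.g., with Baum-Welch), and continuous physiological features would use Gaussian rather than discrete emission distributions, but the decision rule, argmax over per-class likelihoods, is the same.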
