Feature-level fusion of multimodal physiological signals for emotion recognition

Objective: This paper aims to recognize human emotions automatically from multimodal physiological signals, and a novel multimodal feature-fusion approach is proposed. Methods: In the proposed approach, significant multimodal features are selected by two feature-selection methods that are compared against each other: the Fisher Criterion Score and the Davies-Bouldin index. Emotion recognition is performed in the valence-arousal emotion space using hidden Markov models (HMMs) trained on the multimodal feature sets. Four physiological modalities are employed, including electroencephalogram (EEG) signals from the central nervous system and peripheral physiological signals (PERI) from the peripheral nervous system, as provided in the DEAP database. Results: The best recognition accuracies are 85.63% for arousal and 83.98% for valence. The proposed feature-level fusion approach is compared with decision-level fusion and non-fusion approaches on the same database, and the comparison demonstrates significant accuracy improvements for the feature-level fusion approach. Conclusion: Our results support the view that the proposed feature-level fusion approach is a promising methodology for emotion recognition.
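To make the feature-selection step concrete, the sketch below computes the Fisher Criterion Score and a per-feature Davies-Bouldin index for a two-class (low/high) labelling of one emotion dimension and keeps the most discriminative features. This is a minimal illustration under standard definitions of the two criteria; the function names, array shapes, per-feature application of the Davies-Bouldin index, and random stand-in data are our assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of the two feature-selection criteria
# named in the abstract, assuming a feature matrix X of shape
# (n_trials, n_features) and binary labels y (e.g., low/high arousal).
import numpy as np

def fisher_scores(X, y):
    """Fisher Criterion Score per feature: squared between-class mean
    difference over the summed within-class variances (higher = better)."""
    X0, X1 = X[y == 0], X[y == 1]
    num = (X0.mean(axis=0) - X1.mean(axis=0)) ** 2
    den = X0.var(axis=0) + X1.var(axis=0) + 1e-12  # guard against zero variance
    return num / den

def davies_bouldin_per_feature(X, y):
    """Davies-Bouldin index specialized to two one-dimensional clusters per
    feature: within-class scatter over centroid distance (lower = better)."""
    X0, X1 = X[y == 0], X[y == 1]
    s0 = np.abs(X0 - X0.mean(axis=0)).mean(axis=0)  # within-class scatter, class 0
    s1 = np.abs(X1 - X1.mean(axis=0)).mean(axis=0)  # within-class scatter, class 1
    d = np.abs(X0.mean(axis=0) - X1.mean(axis=0)) + 1e-12  # centroid distance
    return (s0 + s1) / d

def select_top_k(scores, k, largest=True):
    """Indices of the k most discriminative features under either criterion."""
    order = np.argsort(scores)
    return order[-k:] if largest else order[:k]

# Example with random stand-in data; real inputs would be EEG/PERI features
# extracted from DEAP trials.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 20))
y = rng.integers(0, 2, size=40)
selected_fisher = select_top_k(fisher_scores(X, y), k=8, largest=True)
selected_db = select_top_k(davies_bouldin_per_feature(X, y), k=8, largest=False)
```

In a feature-level fusion setting, the selected EEG and PERI features would be concatenated into a single vector per trial before training the HMM classifier; the details of that pipeline are described in the paper itself.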
