Bimodal System for Emotion Recognition from Facial Expressions and Physiological Signals Using Feature-Level Fusion

This paper presents an automatic approach to emotion recognition using a bimodal system based on facial expressions and physiological signals. Feature-level fusion combines the information from both modalities. We tested two fusion approaches: the first, based on mutual information, selects the most relevant features; the second, based on principal component analysis, transforms the data into a new feature space. The results obtained using both modalities are better than those obtained with each modality separately.
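The two fusion approaches described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, class count, and all data are hypothetical, and scikit-learn stands in for the LIBSVM classifier and feature-selection machinery.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Hypothetical feature vectors: 40 facial-expression features and
# 20 physiological features per sample, three emotion classes.
facial = rng.normal(size=(120, 40))
physio = rng.normal(size=(120, 20))
labels = rng.integers(0, 3, size=120)

# Feature-level fusion: concatenate both modalities' feature vectors.
fused = np.hstack([facial, physio])

# Approach 1: mutual information ranks and keeps the most relevant
# fused features before classification.
mi_clf = make_pipeline(SelectKBest(mutual_info_classif, k=15), SVC())
mi_clf.fit(fused, labels)

# Approach 2: PCA projects the fused features into a lower-dimensional
# space spanned by the principal components.
pca_clf = make_pipeline(PCA(n_components=15), SVC())
pca_clf.fit(fused, labels)
```

In both cases the fusion happens before classification, on the concatenated feature vector, which is what distinguishes feature-level fusion from decision-level fusion of per-modality classifiers.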
