A Novel Bimodal Emotion Database from Physiological Signals and Facial Expression

SUMMARY In this paper, we establish a novel bimodal emotion database of physiological signals and facial expression, named PSFE. The physiological signals and facial expressions in the PSFE database are recorded simultaneously with a BIOPAC MP 150 system and a Kinect for Windows, respectively. The database comprises 32 subjects, 11 women and 21 men, aged 20 to 25. It covers three basic emotion classes, calmness, happiness and sadness, which correspond to the neutral, positive and negative emotion states, respectively. The database contains 288 samples in total, with 96 samples per emotion class.
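
As a rough illustration, the composition described above can be summarized in a small data-structure sketch. This is only a hypothetical layout: names such as PSFESample and the file-path fields are assumptions for illustration, since the summary does not specify how the recordings are stored.

```python
from dataclasses import dataclass

# Composition stated in the summary.
EMOTION_CLASSES = ("calmness", "happiness", "sadness")  # neutral / positive / negative
SAMPLES_PER_CLASS = 96
NUM_SUBJECTS = 32  # 11 women, 21 men, aged 20 to 25


@dataclass
class PSFESample:
    """One bimodal recording (hypothetical layout, not from the paper)."""
    subject_id: int          # 0..NUM_SUBJECTS - 1
    emotion: str             # one of EMOTION_CLASSES
    physio_path: str         # BIOPAC MP 150 physiological recording
    face_path: str           # Kinect for Windows facial-expression recording


# The totals stated in the summary: 3 classes x 96 samples = 288.
TOTAL_SAMPLES = len(EMOTION_CLASSES) * SAMPLES_PER_CLASS
assert TOTAL_SAMPLES == 288
```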
