Computational emotion recognition using multimodal physiological signals: Elicited using Japanese kanji words

This paper investigates computational emotion recognition using multimodal physiological signals. Four physiological signals - plethysmogram, skin conductance change, respiration rate and skin temperature - are measured to evaluate three emotional states: positive, negative and neutral. Psychophysical experiments are conducted in which Japanese kanji words are presented to subjects to elicit emotions and the corresponding physiological responses. For computational emotion recognition, machine-learning approaches such as multilayer neural networks, support vector machines, decision trees and random forests are used to design emotion recognition systems, and their characteristics are investigated. In the recognition experiments, support vector machines with a Gaussian kernel attain a maximum averaged recognition rate of around 40% for all three emotions and around 56% for two emotions (positive and negative). These results show that combining multimodal physiological signals with a machine-learning approach is feasible and well suited for computational emotion recognition.
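The SVM-with-Gaussian-kernel classifier described in the abstract can be sketched as follows. This is a minimal illustration using scikit-learn, not the authors' implementation: the feature matrix here is a synthetic placeholder standing in for statistics extracted from the four physiological signals, and all dimensions and labels are assumptions.

```python
# Hedged sketch: SVM with a Gaussian (RBF) kernel for three-class emotion
# recognition, as described in the abstract. The data below is synthetic;
# the paper's actual features come from plethysmogram, skin conductance
# change, respiration rate and skin temperature recordings.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical dataset: 90 trials x 8 features (e.g. two summary statistics
# per signal channel); labels 0 = negative, 1 = neutral, 2 = positive.
X = rng.normal(size=(90, 8))
y = rng.integers(0, 3, size=90)

# Standardize features, then classify with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale", C=1.0))

# "Averaged recognition rate" estimated via 5-fold cross-validation.
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean recognition rate: {scores.mean():.2f}")
```

With real features in place of the random matrix, the same pipeline reports the averaged recognition rate the paper uses to compare classifiers; on this random data the score hovers near the 1/3 chance level for three classes.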
