A Distributed Architecture for Multimodal Emotion Identification

This paper introduces a distributed multiagent system architecture for multimodal emotion identification based on the simultaneous analysis of physiological parameters from wearable devices, human behaviors and activities, and facial micro-expressions. The wearable devices are equipped with electrodermal activity, electrocardiogram, heart rate, and skin temperature sensor agents. Facial expressions are monitored by a vision agent installed at the height of the user’s head, while the user’s activity is monitored by a second vision agent mounted overhead. The final emotion is obtained as a cooperative decision taken at a central agent node, the “Central Emotion Detection Node”, from the local decisions offered by three agent nodes: the “Face Expression Analysis Node”, the “Behavior Analysis Node”, and the “Physiological Data Analysis Node”. In this way, the emotion identification results are improved through an intelligent fuzzy-based decision-making technique.
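The paper does not detail the fusion rule used at the central node, so the following is only a minimal sketch of how the three local decisions might be combined with a fuzzy-weighted aggregation. The node names, reliability weights, emotion set, and the sum-then-argmax rule are assumptions introduced for illustration, not the authors’ published method.

```python
# Illustrative sketch (assumed, not the paper's implementation): each analysis node
# emits fuzzy membership degrees per emotion, and the Central Emotion Detection Node
# aggregates them with per-node reliability weights before picking the strongest emotion.

from dataclasses import dataclass
from typing import Dict, List

EMOTIONS = ["anger", "fear", "happiness", "sadness", "neutral"]  # assumed label set


@dataclass
class LocalDecision:
    """Fuzzy membership degrees (0..1) per emotion produced by one analysis node."""
    node: str
    memberships: Dict[str, float]


def fuse(decisions: List[LocalDecision], weights: Dict[str, float]) -> str:
    """Weighted fuzzy aggregation at the central node.

    Each node's membership vector is scaled by that node's reliability weight,
    the scaled degrees are summed per emotion, and the emotion with the highest
    aggregated degree is returned as the cooperative decision.
    """
    aggregated = {emotion: 0.0 for emotion in EMOTIONS}
    for decision in decisions:
        weight = weights.get(decision.node, 1.0)
        for emotion, degree in decision.memberships.items():
            aggregated[emotion] += weight * degree
    return max(aggregated, key=aggregated.get)


if __name__ == "__main__":
    # Hypothetical local decisions from the three analysis nodes.
    decisions = [
        LocalDecision("face", {"happiness": 0.7, "neutral": 0.3}),
        LocalDecision("behavior", {"happiness": 0.4, "neutral": 0.6}),
        LocalDecision("physiology", {"happiness": 0.6, "anger": 0.2}),
    ]
    # Assumed reliability weights for illustration only.
    weights = {"face": 0.5, "behavior": 0.2, "physiology": 0.3}
    print(fuse(decisions, weights))  # -> "happiness"
```

As a design note, any t-conorm or weighted-average aggregation could replace the simple weighted sum shown here; the point is only that the central node combines the three local fuzzy decisions rather than raw sensor data.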
