Paradigms for the Construction and Annotation of Emotional Corpora for Real-world Human-Computer-Interaction

A major building block for the construction of reliable statistical classifiers in the context of affective human-computer interaction is the collection of training samples that appropriately reflect the complex nature of the desired patterns. In this application domain, this is a non-trivial issue: even though it is widely agreed that emotional patterns should be incorporated into future computer systems, it is by no means clear how this should be realized. Open questions remain, such as which types of emotional patterns to consider, how helpful they are for computer interaction, and, more fundamentally, which emotions actually occur in this context. In this paper, we begin by reviewing existing corpora and the respective techniques for the generation of emotional content, and we then motivate and establish approaches that enable the gathering, identification and categorization of patterns of human-computer interaction.
