The OMG-Emotion Behavior Dataset

This paper is the basis paper for the accepted IJCNN challenge One-Minute Gradual-Emotion Recognition (OMG-Emotion)¹, through which we hope to foster research on long-term emotion classification with neural models for the benefit of the IJCNN community. The novelty of the proposed corpus lies in its data collection and annotation strategy, which is based on emotion expressions that evolve over time within a specific context. Unlike other corpora, ours is a multimodal corpus for emotion expression recognition that uses gradual annotations with a focus on contextual emotion expressions. The dataset was collected from YouTube videos using a search strategy based on restricted keywords and filtering, which guaranteed that the data follow a gradual emotion transition, i.e., emotion expressions evolve over time in a natural and continuous fashion. We also provide an experimental protocol and a series of unimodal baseline experiments that can be used to evaluate deep and recurrent neural models in a fair and standard manner.

¹ https://www2.informatik.uni-hamburg.de/wtm/OMG-EmotionChallenge/
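The paper does not publish its collection script, but the keyword-restricted search and duration filtering it describes can be illustrated with a minimal sketch. The sketch below assumes the YouTube Data API v3 (via google-api-python-client); the keyword list and API key are hypothetical placeholders, not the authors' actual query terms.

```python
# Hypothetical sketch of a keyword-restricted YouTube search with duration
# filtering, in the spirit of the collection strategy the paper describes.
# The keywords and key below are illustrative assumptions, not the paper's.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"          # assumption: caller supplies a valid key
KEYWORDS = ["monologue", "vlog"]  # illustrative only, not the paper's list

youtube = build("youtube", "v3", developerKey=API_KEY)

candidate_ids = []
for keyword in KEYWORDS:
    # videoDuration="short" restricts results to clips under four minutes,
    # a plausible filter given the corpus's roughly one-minute samples.
    response = youtube.search().list(
        part="snippet", q=keyword, type="video",
        videoDuration="short", maxResults=50,
    ).execute()
    candidate_ids += [item["id"]["videoId"] for item in response["items"]]
```

Further manual filtering would still be needed to keep only videos whose emotion expressions evolve gradually, which the paper identifies as the defining property of the corpus.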
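For the evaluation protocol, a standard choice when scoring continuous emotion annotations of this kind is the concordance correlation coefficient (CCC, Lin 1989), which rewards both correlation with and calibration to the gold trace. The sketch below is a minimal NumPy implementation of that metric under the assumption that it is applied to per-video arousal or valence predictions; it is an illustration of the metric, not the challenge's official scoring script.

```python
import numpy as np

def concordance_cc(y_true, y_pred):
    """Concordance correlation coefficient (Lin, 1989).

    Measures agreement between gold and predicted continuous traces
    (e.g. arousal or valence): 1.0 is perfect agreement, 0.0 is none.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    # CCC = 2*cov / (var_t + var_p + (mean_t - mean_p)^2)
    return 2.0 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)

# Example: a prediction that tracks the gold trace closely scores near 1.0.
print(concordance_cc([0.1, 0.4, 0.8], [0.2, 0.5, 0.7]))
```

Unlike plain Pearson correlation, CCC penalizes systematic offsets and scale mismatches, which is why it is widely used for dimensional emotion benchmarks.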
