EMOTIC: Emotions in Context Dataset

Recognizing people's emotions from their frame of reference is very important in our everyday life. This capacity helps us perceive and predict people's subsequent actions, interact effectively with them, and be sympathetic and sensitive toward them. Hence, a machine needs a similar capability of understanding people's feelings in order to interact correctly with humans. Current research on emotion recognition has focused mostly on the analysis of facial expressions. However, recognizing emotions also requires understanding the scene in which a person is immersed. Research on emotion recognition in context has been hindered by the lack of suitable data for studying the problem. In this paper, we present the EMOTIC database (from EMOTions In Context), a database of images of people in real environments, annotated with their apparent emotions. We defined an extended list of 26 emotion categories for annotating the images, and combined these categorical annotations with three common continuous dimensions: Valence, Arousal, and Dominance. Images in the database were annotated using the Amazon Mechanical Turk (AMT) platform. The resulting set contains 18,313 images with 23,788 annotated people. The goal of this paper is to present the EMOTIC database, detailing how it was created and what information it contains. We expect this dataset to help open up new horizons for creating systems capable of recognizing rich information about people's apparent emotional states.
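To make the annotation structure concrete, below is a minimal sketch in Python of how a single EMOTIC-style record could be represented: each annotated person carries a subset of the 26 discrete emotion categories plus continuous Valence, Arousal, and Dominance scores. The class and field names, the bounding-box convention, the example category strings, and the numeric scale are illustrative assumptions, not the dataset's actual file format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical sample of the 26 discrete EMOTIC categories; the exact
# label strings used by the dataset may differ.
EXAMPLE_CATEGORIES = ["Peace", "Affection", "Engagement", "Anticipation"]

@dataclass
class PersonAnnotation:
    """One annotated person in an EMOTIC-style image (illustrative schema)."""
    bbox: Tuple[int, int, int, int]  # assumed (x, y, width, height) person box
    categories: List[str] = field(default_factory=list)  # subset of the 26 labels
    valence: float = 0.0    # continuous dimensions; scale assumed here
    arousal: float = 0.0
    dominance: float = 0.0

@dataclass
class ImageAnnotation:
    """All annotated people in one image."""
    image_path: str
    people: List[PersonAnnotation] = field(default_factory=list)

# Usage example with made-up values:
ann = ImageAnnotation(
    image_path="images/example.jpg",
    people=[
        PersonAnnotation(
            bbox=(40, 25, 120, 300),
            categories=["Engagement", "Anticipation"],
            valence=6.0, arousal=7.0, dominance=5.0,
        )
    ],
)
```

A real loader would parse the released annotation files into records of roughly this shape before feeding them to a recognition model.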
