A Computational Model for Mood Recognition

In an environment designed to adapt to the user's affective state, pervasive technology should be able to unobtrusively decipher the user's underlying mood. Great effort has been devoted to automatic recognition of punctual (momentary) emotions from visual input. Conversely, little has been done to recognize longer-lasting affective states, such as mood. Assuming the effectiveness of existing emotion recognition algorithms, we go one step further and propose a model for estimating the mood of an affective episode from a known sequence of punctual emotions. To validate our model experimentally, we rely on the human annotations of the well-established HUMAINE database. Our analysis indicates that we can approximate fairly accurately the human process of summarizing the emotional content of a video into a mood estimate. A moving average function that exponentially discounts past emotions achieves mood prediction accuracy above 60%.
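The core aggregation step described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes punctual emotions are represented as (valence, arousal) pairs, and the function name, discount value, and sign-of-valence mood label are all illustrative choices.

```python
def mood_estimate(emotions, discount=0.9):
    """Summarize a sequence of punctual emotions into a mood estimate
    via a moving average with exponential discount of past emotions.

    emotions: list of (valence, arousal) pairs, oldest first.
    discount: factor in (0, 1]; older observations weigh less.
    Returns the discounted-average (valence, arousal) point.
    """
    n = len(emotions)
    weight_sum = v_sum = a_sum = 0.0
    for i, (valence, arousal) in enumerate(emotions):
        # The most recent emotion gets weight 1; each step back
        # into the past multiplies the weight by `discount`.
        w = discount ** (n - 1 - i)
        weight_sum += w
        v_sum += w * valence
        a_sum += w * arousal
    return (v_sum / weight_sum, a_sum / weight_sum)


def mood_label(emotions, discount=0.9):
    """Map the averaged valence to a coarse mood label
    (an illustrative thresholding, not the paper's classifier)."""
    valence, _ = mood_estimate(emotions, discount)
    return "positive" if valence >= 0 else "negative"
```

For example, a recent strongly negative emotion can outweigh an older positive one: with `discount=0.5`, the last observation counts twice as much as the one before it.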
