Predicting Mood from Punctual Emotion Annotations on Videos

A smart environment designed to adapt to a user's affective state should be able to unobtrusively infer that user's underlying mood. Great effort has been devoted to automatic punctual emotion recognition from visual input; by contrast, little has been done to recognize longer-lasting affective states such as mood. Assuming the effectiveness of emotion recognition algorithms, we propose a model for estimating mood from a known sequence of punctual emotions. To validate our model experimentally, we rely on the human annotations of two well-established databases: VAM and HUMAINE. We perform two analyses. The first serves as a proof of concept and tests whether punctual emotions cluster around the mood in the emotion space. The results indicate that emotion annotations that are continuous in time and value facilitate mood estimation, as opposed to discrete emotion annotations scattered randomly within the video timespan. The second analysis explores factors that account for mood recognition from emotions by examining how individual human coders perceive the underlying mood of a person. A moving average function with exponential discount of past emotions achieves mood prediction accuracy above 60 percent, exceeding both chance level and mutual human agreement.
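The mood estimator described above can be sketched as an exponentially discounted moving average over a sequence of punctual emotion annotations in valence-arousal space. The function name, data layout, and discount parameter below are illustrative assumptions, not the paper's actual implementation:

```python
def estimate_mood(emotions, discount=0.9):
    """Estimate mood as an exponentially discounted moving average.

    emotions: chronological list of (valence, arousal) tuples, one per
              punctual emotion annotation.
    discount: weight multiplier applied per step into the past
              (0 < discount <= 1); illustrative default.
    Returns a (valence, arousal) mood estimate.
    """
    valence = arousal = weight_sum = 0.0
    # The most recent emotion gets weight 1; an emotion `age` steps in
    # the past is weighted by discount ** age.
    for age, (v, a) in enumerate(reversed(emotions)):
        w = discount ** age
        valence += w * v
        arousal += w * a
        weight_sum += w
    return (valence / weight_sum, arousal / weight_sum)

# Example: a sequence drifting from strongly to mildly positive valence.
mood = estimate_mood([(0.8, 0.2), (0.6, 0.1), (0.4, 0.0)])
```

The estimate is dominated by recent emotions while still reflecting older ones, which matches the intuition that mood is a slowly varying aggregate of punctual emotional episodes.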
