Ranking-Based Affect Estimation of Motion Capture Data in the Valence-Arousal Space

Affect estimation consists of building a predictive model of the perceived affect of a given stimulus. In this study, we examine perceived affect in full-body motion capture data of various movements. The study has two parts. In the first part, we ground-truth affective labels for motion capture sequences by hosting a survey on a crowdsourcing platform, in which participants from around the world ranked the relative valence and arousal of one motion capture sequence against another. In the second part, we present our experiments in training a machine learning model for pairwise ranking of motion capture data using RankNet. Our analysis shows reasonable inter-rater agreement among the participants. The evaluation of the RankNet model demonstrates that it can learn to rank the motion capture data, with higher confidence in the arousal dimension than in the valence dimension.
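The pairwise-ranking idea behind RankNet can be sketched as follows. This is an illustrative toy example, not the implementation used in the study: it uses a plain linear scoring function and hypothetical two-dimensional movement features, whereas the actual model is a neural network trained on motion capture descriptors. RankNet models the probability that item i outranks item j (e.g. has higher arousal) as a sigmoid of the score difference, and minimizes the cross-entropy against the crowdsourced pairwise preferences.

```python
import math

# Minimal RankNet-style pairwise ranker (illustrative sketch only).
# A linear scoring function f(x) = w . x is trained so that
# sigmoid(f(x_i) - f(x_j)) approximates P(item i ranked above item j).

def score(w, x):
    """Linear score of feature vector x under weights w."""
    return sum(wi * xi for wi, xi in zip(w, x))

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(pairs, dim, lr=0.1, epochs=200):
    """pairs: list of (x_preferred, x_other) feature-vector tuples,
    where annotators ranked the first item above the second."""
    w = [0.0] * dim
    for _ in range(epochs):
        for xi, xj in pairs:
            # Cross-entropy loss with target P(i > j) = 1 gives the
            # gradient step: w += lr * (1 - sigmoid(s_i - s_j)) * (x_i - x_j)
            g = 1.0 - sigmoid(score(w, xi) - score(w, xj))
            w = [wk + lr * g * (a - b) for wk, a, b in zip(w, xi, xj)]
    return w

# Toy data: hypothetical 2-D features, first feature tracks the preference.
pairs = [([1.0, 0.2], [0.1, 0.9]),
         ([0.8, 0.5], [0.3, 0.4])]
w = train(pairs, dim=2)
assert score(w, [1.0, 0.2]) > score(w, [0.1, 0.9])
```

In the actual model the linear `score` would be replaced by a shared neural network applied to both items of a pair, but the pairwise sigmoid loss is the same.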
