Automatic Affect Classification of Human Motion Capture Sequences in the Valence-Arousal Model

The problem we address is affect classification: analysing the emotion conveyed by input data. This study has two parts. In the first part, to support better recognition and classification of affect from human movement, we investigate whether the labels on existing Motion Capture (MoCap) data are consistent with human perception to a reasonable extent. Specifically, we examine movement in terms of valence and arousal (the pleasantness and energy of the expressed emotion). In the second part, we present machine learning techniques for affect classification of human motion capture sequences using both categorical and continuous approaches. For the categorical approach, we evaluate the performance of Hidden Markov Models (HMMs). For the continuous approach, we use stepwise linear regression models, with the participant responses collected in the first part serving as the ground-truth valence and arousal labels for each movement.
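As a rough illustration of the second part, the sketch below fits one Gaussian HMM per affect category and classifies a test sequence by log-likelihood, and approximates stepwise regression with forward feature selection followed by ordinary least squares. The use of hmmlearn and scikit-learn, the function names (train_hmms, classify, fit_valence_regressor), and the feature representation are illustrative assumptions, not the paper's actual pipeline.

```python
# Minimal sketch, assuming per-frame joint features shaped (T, D) per sequence
# and scalar valence ratings per movement; not the authors' implementation.
import numpy as np
from hmmlearn import hmm                              # pip install hmmlearn
from sklearn.linear_model import LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector


def train_hmms(sequences_by_label, n_states=4):
    """Fit one Gaussian HMM per affect label from lists of (T, D) feature arrays."""
    models = {}
    for label, seqs in sequences_by_label.items():
        X = np.vstack(seqs)                           # concatenate frames of all sequences
        lengths = [len(s) for s in seqs]              # per-sequence frame counts
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[label] = m
    return models


def classify(models, seq):
    """Return the label whose HMM assigns the highest log-likelihood to the sequence."""
    return max(models, key=lambda lbl: models[lbl].score(seq))


def fit_valence_regressor(features, valence, k=5):
    """Forward-select k movement features, then fit a linear model to rated valence."""
    selector = SequentialFeatureSelector(
        LinearRegression(), n_features_to_select=k, direction="forward"
    ).fit(features, valence)
    reg = LinearRegression().fit(selector.transform(features), valence)
    return selector, reg
```

The same regression sketch would be repeated for arousal; scikit-learn's forward selection stands in for true stepwise (add/drop) regression, which is a simplification of the method described above.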
