Expressive motions recognition and analysis with learning and statistical methods

This paper proposes to recognize and analyze expressive gestures using a descriptive motion language, Laban Movement Analysis (LMA). We extract body features based on the LMA factors, which describe both quantitative and qualitative aspects of human movement. For this study, we created a dataset of 5 gestures, each performed with 4 emotions, recorded with the Xsens motion capture system. We used two different approaches for emotion analysis and recognition: the first is based on a machine learning method, the Random Decision Forest (RDF); the second is based on human perception. Using these same two methods, the RDF and the human ratings, we derive the most important features for each expressed emotion. In the discussion section, we compare the results obtained from the automatic learning method against those from human perception.
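The recognition pipeline described above, LMA-based motion features fed to a Random Decision Forest, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature dimensionality, the emotion label set, and the synthetic data are assumptions standing in for the extracted LMA descriptors.

```python
# Hedged sketch: emotion classification from LMA-style motion features
# with a Random Forest. The 12-dimensional features and the 4 emotion
# labels below are illustrative placeholders, not the paper's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
EMOTIONS = ["happy", "sad", "angry", "calm"]  # assumed label set

# Synthetic stand-in for per-gesture LMA descriptors
# (e.g. Body, Effort, Shape, and Space statistics).
X = rng.normal(size=(200, 12))
y = rng.integers(0, len(EMOTIONS), size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)

# The forest's feature importances can be ranked to identify the most
# discriminative descriptors, analogous to the paper's feature analysis.
ranked = np.argsort(clf.feature_importances_)[::-1]
pred = clf.predict(X_te)
```

Ranking `clf.feature_importances_` is one standard way to extract per-feature relevance from a trained forest; the paper compares such automatically derived importances against features that human raters judged salient.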
