Analyzing Spontaneous Gestures for Emotional Stress State Recognition: A Micro-gesture Dataset and Analysis with Deep Learning

Emotions are central to human intelligence and should play a similar role in AI. In emotion recognition, however, the cues analyzed by machines have mostly been limited to human facial expressions and speech. Body gestures, an equally important non-verbal communication channel, have been shown to convey emotional information and deserve more attention. Inspired by recent research on micro-expressions, in this paper we explore a specific group of gestures that are spontaneously and unconsciously elicited by inner feelings. Unlike common gestures used to facilitate communication or to express feelings deliberately, these subtle body movements, known as ‘micro-gestures’ (MGs), are usually overlooked in daily life. Work on interpreting hidden human emotions through such gestural behaviors in unconstrained situations remains limited, because the correspondence between body movements and emotional states is unclear and requires multidisciplinary effort from computer science, psychology, and statistics. To fill this gap, we built a novel Spontaneous Micro-Gesture (SMG) dataset containing 3,692 manually labeled gesture clips, collected from 40 participants through a story-telling game with two emotional state settings. We explored the emotional gestures with a sign-based measurement. To verify the latent relationship between emotional states and MGs, we propose a framework that encodes the observed gestures into a Bayesian network to infer the subjective emotional states. Our experimental results reveal that most participants performed micro-gestures spontaneously to relieve mental strain. We also carried out a human test on both ordinary and trained observers for comparison.
The performance of both our framework and the human participants was evaluated on 142 test instances (71 per emotional state) under subject-independent testing. To the authors’ best knowledge, this is the first MG dataset to be presented. Results show that the proposed MG recognition method achieves promising performance, and that MGs can serve as helpful cues for recognizing hidden emotional states.
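The idea of encoding observed gestures into a probabilistic model to infer a hidden emotional state can be sketched as follows. This is an illustrative toy example, not the authors' implementation: the gesture names and all probabilities are hypothetical, and a naive Bayes factorization stands in for the full Bayesian network described in the paper.

```python
# Minimal sketch: infer a hidden emotional state from observed micro-gestures
# via P(state | gestures) ∝ P(state) * Π P(gesture | state).
# All gesture labels and probability values here are hypothetical.
import math

# Hypothetical per-state likelihoods of observing each micro-gesture.
LIKELIHOOD = {
    "stressed": {"touch_face": 0.30, "rub_hands": 0.25, "fold_arms": 0.20},
    "relaxed":  {"touch_face": 0.10, "rub_hands": 0.05, "fold_arms": 0.15},
}
PRIOR = {"stressed": 0.5, "relaxed": 0.5}  # two emotional-state settings

def infer_state(observed_gestures):
    """Return the normalized posterior P(state | observed gestures)."""
    log_post = {}
    for state, prior in PRIOR.items():
        lp = math.log(prior)
        for g in observed_gestures:
            lp += math.log(LIKELIHOOD[state][g])
        log_post[state] = lp
    # Normalize in log space for numerical stability.
    m = max(log_post.values())
    unnorm = {s: math.exp(lp - m) for s, lp in log_post.items()}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}

posterior = infer_state(["touch_face", "rub_hands"])
```

With the toy numbers above, observing face-touching and hand-rubbing together shifts the posterior strongly toward the stressed state, which mirrors the paper's premise that clusters of self-adaptor micro-gestures are informative about mental strain.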
