Perception of Emotion in Body Expressions from Gaze Behavior

The development of affectively aware technologies is a growing industry. To build such technologies effectively, it is important to understand which features people use to discriminate between emotions. While many technologies focus on facial expressions, studies have highlighted that body expressions can dominate other modalities in the perception of some emotions. Eye tracking studies have examined combined face and body stimuli to investigate the influence of each modality; however, few, if any, have investigated the perception of emotion from body expressions alone. This exploratory study aimed to evaluate the discriminative importance of dynamic body features for decoding emotion. Eye tracking was used to monitor participants' gaze behavior while they viewed clips of non-acted body movements and assigned an emotion to each clip. Preliminary results indicate that the two regions attended to most often and for the longest duration were the torso and the arms. Further analysis is ongoing; however, the initial results independently confirm findings from prior studies that did not use eye tracking.
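As a rough illustration of the kind of area-of-interest (AOI) analysis described above, the sketch below aggregates fixation counts and dwell times per body region from eye-tracking output. It is a minimal sketch only: the file name, column layout, and AOI labels are assumptions for illustration, not the study's actual data format or analysis pipeline.

```python
# Illustrative sketch: summarize fixation count and dwell time per body-region AOI.
# Assumes a CSV with columns participant, clip, aoi, duration_ms (hypothetical layout).
import csv
from collections import defaultdict

def summarize_fixations(path):
    """Accumulate fixation counts and total dwell time (ms) for each AOI label."""
    counts = defaultdict(int)
    dwell_ms = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            aoi = row["aoi"]              # e.g. "head", "torso", "arms", "legs" (assumed labels)
            counts[aoi] += 1
            dwell_ms[aoi] += float(row["duration_ms"])
    return counts, dwell_ms

if __name__ == "__main__":
    counts, dwell = summarize_fixations("fixations.csv")  # hypothetical file name
    for aoi in sorted(counts, key=counts.get, reverse=True):
        print(f"{aoi:8s}  fixations={counts[aoi]:5d}  dwell={dwell[aoi]:.0f} ms")
```

Under this assumed layout, ranking AOIs by fixation count or total dwell time is what would surface the torso and arms as the most attended regions.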
