The Role of Body Postures in the Recognition of Emotions in Contextually Rich Scenarios

This article investigates the role of different categories of body postures in the detection, recognition, and interpretation of emotions in contextually rich scenarios, including ironic items. Animated scenarios featuring 3D virtual agents were designed to test three conditions: in the “still” condition, the narrative content was accompanied by emotional facial expressions without any body movement; in the “idle” condition, emotionally neutral body movements were introduced; and in the “congruent” condition, emotional body postures congruent with the character's facial expressions were displayed. Twenty-seven subjects viewed these conditions, and their impact on the viewers' attentional and emotional processes was assessed. The results highlight the importance of contextual information for emotion recognition and irony interpretation. They also show that both idle and emotional postures improve the detection of emotional expressions. Moreover, emotional postures increase the perceived intensity of the emotions and the realism of the animations.
