Expressive Copying Behavior for Social Agents: A Perceptual Analysis

Successful human interaction commonly involves prototypical exchanges in which interactors are engaged, synchronized, and harmonious in their behaviors. Copying aspects of the other's behavior, at various levels, appears central to establishing and maintaining such empathic connections. Yet many questions remain unanswered, particularly how the same affective content can be reflected back to the other when the actual motion is not identical to theirs. This paper presents a perceptual study in which emotional gestures performed by an actor were mapped onto synthesized versions generated by an embodied virtual agent. Copying takes place at the expressive level, where qualities such as the fluidity or expansiveness of gestures are considered, rather than through exact low-level motion matching. Participants were then asked to rate the emotional content of video recordings of both the original and the synthesized gestures. A statistical analysis shows that, in most cases, participants associated the emotional content of the agent's gestures with that intended by the original actor. The results suggest that a combination of the type of movement performed and its quality is important for successfully communicating emotions.
