Communication of Emotion in Social Robots through Simple Head and Arm Movements

Understanding how people perceive robot gestures will aid the design of robots capable of social interaction with humans. We examined the generation and perception of a restricted form of gesture in a robot capable of simple head and arm movement, drawing on point-light animation and video studies of human motion perception to derive our hypotheses. Four studies examined the effects of situational context, gesture complexity, emotional valence, and author expertise. In Study 1, four participants created gestures, each intended to convey an emotion, in response to 12 provided scenarios. In Study 2, 12 participants judged the resulting gestures. Recognition of emotion was better than chance and improved when situational context was provided. Ratings of lifelikeness were related to the number of arm movements (but not head movements) in a gesture. In Study 3, five novices and five puppeteers created gestures conveying Ekman’s six basic emotions, which were then shown to 12 participants in Study 4. Puppetry experience improved identification rates only for fear and disgust, possibly because of the limitations of the robot’s movement. Together, the results demonstrate that a social robot capable of only simple head and arm movement can communicate emotion.

[1]  George Latshaw. The Complete Book of Puppetry, 2000.

[2]  Satoshi Suzuki. Background and trends in Media Equation research (in Japanese), 2011.

[3]  Paul J. Feltovich et al. Categorization and Representation of Physics Problems by Experts and Novices, 1981, Cogn. Sci.

[4]  Tetsuo Ono et al. Body Movement Analysis of Human-Robot Interaction, 2003, IJCAI.

[5]  David Levy. Intimate Relationships With Artificial Partners, 2007.

[6]  T. Kanda et al. Altered Attitudes of People toward Robots: Investigation through the Negative Attitudes toward Robots Scale, 2006.

[7]  Patrice D. Tremoulet et al. Perception of Animacy from the Motion of a Single Object, 2000, Perception.

[8]  Kristinn R. Thórisson et al. The Power of a Nod and a Glance: Envelope vs. Emotional Feedback in Animated Conversational Agents, 1999, Appl. Artif. Intell.

[9]  Justine Cassell. Embodied conversational interface agents, 2000, CACM.

[10]  Jordan Zlatev. The Epigenesis of Meaning in Human Beings, and Possibly in Robots, 2001, Minds and Machines.

[11]  M. Shiffrar et al. Recognizing people from their movement, 2005, Journal of Experimental Psychology: Human Perception and Performance.

[12]  David J. Sturman. Computer Puppetry, 1998, IEEE Computer Graphics and Applications.

[13]  Louis-Philippe Morency et al. The effect of head-nod recognition in human-robot conversation, 2006, HRI '06.

[14]  Toshio Fukuda et al., 2008.

[15]  S. Gelman et al. Mapping the Mind: Domain Specificity in Cognition and Culture, 1994.

[16]  Frank Biocca. The Cyborg's Dilemma: Progressive Embodiment in Virtual Environments, 2006, J. Comput. Mediat. Commun.

[17]  Tatsuya Nomura et al. Comparison on Identification of Affective Body Motions by Robots Between Elder People and University Students: A Case Study in Japan, 2010, Int. J. Soc. Robotics.

[18]  P. Ekman et al. Emotion in the Human Face: Guidelines for Research and an Integration of Findings, 1972.

[19]  E. Blumenthal. Puppetry: A World History, 2005.

[20]  Susan Bell Trickett et al. The Relationship Between Spatial Transformations and Iconic Gestures, 2006, Spatial Cogn. Comput.

[21]  John E. Opfer. Identifying living and sentient kinds from dynamic information: the case of goal-directed versus aimless autonomous movement in conceptual change, 2002, Cognition.

[22]  M. Argyle. The Psychology of Interpersonal Behaviour, 1967.

[23]  A. Takanishi. Various emotional expressions with emotion expression humanoid robot WE-4RII, 2004, IEEE Technical Exhibition Based Conference on Robotics and Automation (TExCRA 2004).

[24]  H. Ishiguro. A Model of Embodied Communications with Gestures between Human and Robots, 2001.

[25]  Luís Paulo Reis et al. Biometric Emotion Assessment and Feedback in an Immersive Digital Environment, 2009, Int. J. Soc. Robotics.

[26]  P. Ekman et al. The Repertoire of Nonverbal Behavior: Categories, Origins, Usage, and Coding, 1969.

[27]  Hiroshi Ishiguro et al. Motion modification method to control affective nuances for robots, 2009, IEEE/RSJ International Conference on Intelligent Robots and Systems.

[28]  Illah R. Nourbakhsh et al. A survey of socially interactive robots, 2003, Robotics Auton. Syst.

[29]  Cory D. Kidd et al. Comparison of Social Presence in Robots and Animated Characters, 2002.

[30]  D. McNeill. Gesture and Thought, 2005.

[31]  G. Beattie. Visible Thought: The New Psychology of Body Language, 2007.

[32]  Jeffrey C. Trinkle et al. "Outside-in" Design for Interdisciplinary HRI Research, 2009, AAAI Spring Symposium: Experimental Design for Real-World Systems.

[33]  John D. Bransford et al. Gender Representation and Humanoid Robots Designed for Domestic Use, 2009, Int. J. Soc. Robotics.

[34]  S. Lea et al. Perception of Emotion from Dynamic Point-Light Displays Represented in Dance, 1996, Perception.

[35]  I-Ming Chen et al. Design expressive behaviors for robotic puppet, 2002, 7th International Conference on Control, Automation, Robotics and Vision (ICARCV 2002).

[36]  M. Sawada et al. Expression of Emotions in Dance: Relation between Arm Movement Characteristics and Emotion, 2003, Perceptual and Motor Skills.

[37]  Chrystopher L. Nehaniv. Classifying types of gesture and inferring intent, 2005.

[38]  Kazuhiro Ueda et al. Interaction with a Moving Object Affects One’s Perception of Its Animacy, 2010, Int. J. Soc. Robotics.

[39]  Ming-Hsiang Su et al. A user-oriented framework for the design and implementation of pet robots, 2004, IEEE International Conference on Systems, Man and Cybernetics.

[40]  Jeffrey C. Trinkle et al. ShadowPlay: A generative model for nonverbal human-robot interaction, 2009, 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[41]  Carey K. Morewedge et al. Timescale bias in the attribution of mind, 2007, Journal of Personality and Social Psychology.

[42]  D. Premack. The infant's theory of self-propelled objects, 1990, Cognition.

[43]  Beatrice de Gelder et al. I Feel Your Voice, 2010, Psychological Science.

[44]  A. Young et al. Emotion Perception from Dynamic and Static Body Expressions in Point-Light and Full-Light Displays, 2004, Perception.

[45]  Cynthia Breazeal. Toward sociable robots, 2003, Robotics Auton. Syst.

[46]  B. de Gelder et al. Social context influences recognition of bodily expressions, 2010, Experimental Brain Research.

[47]  Atsuo Takanishi et al. Basic emotional walking using a biped humanoid robot, 1999, IEEE International Conference on Systems, Man, and Cybernetics (SMC '99).

[48]  R. Blake et al. Perception of human motion, 2007, Annual Review of Psychology.

[49]  A. Leslie. ToMM, ToBY, and Agency: Core architecture and domain specificity, in Mapping the Mind, 1994.

[50]  Jessica K. Hodgins et al. Perception of Human Motion With Different Geometric Models, 1998, IEEE Trans. Vis. Comput. Graph.

[51]  D. Rakison et al. Developmental origin of the animate-inanimate distinction, 2001, Psychological Bulletin.

[52]  H. Schlosberg. Three dimensions of emotion, 1954, Psychological Review.

[53]  Ipke Wachsmuth et al. Embodied Communication in Humans and Machines, 2008, AI Mag.

[54]  K. M. Lee et al. Can robots manifest personality? An empirical test of personality recognition, social responses, and social presence in human-robot interaction, 2006.

[55]  T. J. Clarke et al. The Perception of Emotion from Body Movement in Point-Light Displays of Interpersonal Dialogue, 2005, Perception.

[56]  L. Kaufman et al. Distinguishing Between Animates and Inanimates: Not by Motion Alone, 1995.

[57]  J. Montepare et al. The Use of Body Movements and Gestures as Cues to Emotions in Younger and Older Adults, 1999.

[58]  Armin Bruderlin et al. Perceiving affect from arm movement, 2001, Cognition.

[59]  B. DePaulo et al. Sex differences in eavesdropping on nonverbal cues, 1979.

[60]  Allison Druin et al. A storytelling robot for pediatric rehabilitation, 2000, Assets '00.

[61]  Samuel Fillenbaum et al. Psycholinguistics: A New Approach, 1987.

[62]  H. Simon et al. Perception in chess, 1973.

[63]  Hiroshi Mizoguchi et al. Realization of Expressive Mobile Robot, 1997, Proceedings of the International Conference on Robotics and Automation.

[64]  M. de Meijer. The contribution of general features of body movement to the attribution of emotions, 1989.

[65]  J. N. Bassili. Temporal and spatial contingencies in the perception of social events, 1976.

[66]  A. Atkinson et al. Evidence for distinct contributions of form and motion information to the recognition of emotions from body gestures, 2007, Cognition.