Perception of affect elicited by robot motion

Nonverbal behaviors serve as a rich source of information in human-to-human communication. In particular, motion cues can reveal details about a person's current physical and mental state. Research has shown that people interpret not only the motion cues of humans in these terms, but also the motion of animals and of inanimate devices such as robots. To integrate mobile robots successfully into domestic environments, designers therefore have to take into account how the device will be perceived by its users. In this study we analyzed the relationship between the motion characteristics of a robot and perceived affect. Based on a literature study we selected two motion characteristics, acceleration and curvature, which appear to be the most influential for how motion is perceived. We systematically varied these motion parameters and recorded participants' interpretations of the affective content. Our results suggest a strong relationship between motion parameters and the attribution of affect, while the type of embodiment had no effect. Furthermore, we found that the level of acceleration predicts perceived arousal, and that valence information is at least partly encoded in an interaction between acceleration and curvature. These findings are important for designing the behaviors of future autonomous household robots.
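
The reported relationship lends itself to a simple regression formulation. The Python sketch below is purely illustrative: the stimulus grid, the rating values, and the linear model are hypothetical assumptions rather than the study's actual data or analysis. It only shows how arousal could be modeled as a main effect of acceleration, while valence carries an acceleration-by-curvature interaction term.

    import numpy as np

    # Hypothetical stimulus grid: each trajectory is defined by an
    # acceleration level (m/s^2) and a curvature level (1/m).
    acceleration = np.array([0.5, 0.5, 0.5, 1.5, 1.5, 1.5, 3.0, 3.0, 3.0])
    curvature    = np.array([0.0, 0.5, 2.0, 0.0, 0.5, 2.0, 0.0, 0.5, 2.0])

    # Hypothetical mean ratings on a 9-point scale (e.g., a SAM-style
    # instrument), made up to illustrate the reported pattern: arousal
    # tracks acceleration; valence drops only when high acceleration
    # and high curvature co-occur.
    arousal = np.array([2.1, 2.3, 2.4, 4.8, 5.0, 5.2, 7.6, 7.8, 8.0])
    valence = np.array([5.0, 5.1, 5.2, 5.1, 4.6, 3.9, 5.2, 4.0, 2.6])

    def fit_linear(y, *predictors):
        """Ordinary least squares with an intercept; returns coefficients."""
        X = np.column_stack([np.ones_like(y)] + list(predictors))
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef

    # Arousal ~ acceleration (main effect only).
    b0, b_acc = fit_linear(arousal, acceleration)
    print(f"arousal ~ {b0:.2f} + {b_acc:.2f} * acceleration")

    # Valence ~ acceleration + curvature + acceleration:curvature.
    b0, b_a, b_c, b_ac = fit_linear(valence, acceleration, curvature,
                                    acceleration * curvature)
    print(f"interaction coefficient (acc x curv): {b_ac:.2f}")

With the made-up ratings above, the interaction coefficient comes out negative, mirroring the described finding that fast, sharply curved motion is read as more negative in valence.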
