Perception of affect elicited by robot motion

Nonverbal behaviors serve as a rich source of information in inter-human communication. In particular, motion cues can reveal details about a person's current physical and mental state. Research has shown that people interpret not only the motion cues of humans in these terms, but also the motion of animals and of inanimate devices such as robots. To successfully integrate mobile robots into domestic environments, designers therefore have to take into account how the device will be perceived by the user. In this study, we analyzed the relationship between the motion characteristics of a robot and perceived affect. Based on a literature study, we selected two motion characteristics, acceleration and curvature, that appear to be most influential for how motion is perceived. We systematically varied these motion parameters and recorded participants' interpretations in terms of affective content. Our results suggest a strong relation between motion parameters and the attribution of affect, while the type of embodiment had no effect. Furthermore, we found that the level of acceleration can be used to predict perceived arousal, and that valence information is at least partly encoded in an interaction between acceleration and curvature. These findings are important for the design of behaviors for future autonomous household robots.
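The two motion parameters named above, acceleration and curvature, can be illustrated with a minimal sketch of how a planar robot trajectory might be generated from them. This is an assumption-laden illustration, not the authors' actual stimulus-generation procedure: it assumes constant tangential acceleration and constant path curvature, and integrates position with a simple Euler step.

```python
import math

def trajectory(accel, curvature, steps=100, dt=0.05):
    """Illustrative sketch (not the paper's method): integrate a planar
    path with constant tangential acceleration `accel` and constant
    path curvature `curvature` (heading turns at rate curvature * speed)."""
    x = y = heading = 0.0
    v = 0.0
    points = []
    for _ in range(steps):
        v += accel * dt                  # speed grows with constant acceleration
        heading += curvature * v * dt    # curvature = turn rate per unit arc length
        x += v * math.cos(heading) * dt  # simple Euler position update
        y += v * math.sin(heading) * dt
        points.append((x, y))
    return points

# A zero-curvature path stays on the x-axis; nonzero curvature bends it away.
straight = trajectory(accel=0.5, curvature=0.0)
curved = trajectory(accel=0.5, curvature=1.0)
```

In a stimulus set like the one described, each (acceleration, curvature) pair would yield one motion condition; sweeping both parameters produces the systematic variation the abstract refers to.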
