In Good Company? Perception of Movement Synchrony of a Non-Anthropomorphic Robot

Recent technological developments, such as cheap sensors and the falling cost of computational power, have brought robotic home companions within reach. To be accepted, these robots must be able to participate meaningfully in social interactions with their users and make them feel comfortable during these interactions. In this study we investigated how people respond to a situation in which a companion robot is watching its user. Specifically, we tested the effect of robotic behaviours that are synchronised with the actions of a human, evaluating their influence on the robot’s likeability and perceived intelligence using an online video survey. The robot used was the Care-O-bot 3, a non-anthropomorphic robot with a limited range of expressive motions. We found that even minimal, positively synchronised movements during an object-oriented task were interpreted by participants as engagement and created a positive disposition towards the robot. Moreover, even negatively synchronised movements of the robot led to more positive perceptions than a robot that did not move at all. The results emphasise a) the powerful role that robot movements in general can have on participants’ perception of the robot, and b) that synchronisation of body movements can be a powerful means of enhancing positive attitudes towards a non-anthropomorphic robot.
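The distinction between positively and negatively synchronised movement can be made concrete. One common way (a minimal sketch, not the method used in the paper) is to correlate the human's and the robot's movement-speed time series: a correlation near +1 indicates in-phase, positively synchronised motion, while a value near −1 indicates anti-phase, negatively synchronised motion. The signals below are hypothetical.

```python
# Illustrative sketch: quantifying movement synchrony as the Pearson
# correlation between two movement-speed time series. This is an
# assumption for illustration, not the measure used in the study.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical movement speeds sampled over time.
human      = [0.1, 0.4, 0.9, 0.4, 0.1, 0.4, 0.9, 0.4]
in_phase   = [0.2, 0.5, 0.8, 0.5, 0.2, 0.5, 0.8, 0.5]  # robot mirrors the human
anti_phase = [0.8, 0.5, 0.2, 0.5, 0.8, 0.5, 0.2, 0.5]  # robot moves against the human

print(pearson(human, in_phase))    # close to +1: positively synchronised
print(pearson(human, anti_phase))  # close to -1: negatively synchronised
```

In practice such signals would come from motion tracking of both partners, and a lag term is often added to allow for reaction delays; the zero-lag correlation above is the simplest case.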
