Automatic processing of irrelevant co-speech gestures with human but not robot actors

Non-verbal, or visual, communication is an important component of daily human-to-human interaction. Gestures are one mode of visual communication, in which movement of the body conveys a message either alone or in conjunction with speech. This experiment explores how humans perceive gestures made by a humanoid robot compared with the same gestures made by a human. We do this by adapting and replicating a perceptual experiment by Kelly et al., in which a Stroop-like task was used to demonstrate that gesture and speech are processed together automatically. Fifty-nine college students participated in our experiment. Our results support the notion that automatic gesture processing occurs when interacting with human actors, but not with robot actors. We discuss the implications of these findings for the HRI community.
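The Stroop-like design crosses actor type with congruent and incongruent gesture-speech pairings. As a minimal sketch of how such a trial list might be structured (the actor labels, stimulus words, and repetition count are illustrative assumptions, not the materials used in the study):

```python
# Hypothetical sketch of a Stroop-like gesture-speech congruency design.
# Stimulus words, actor labels, and repetition counts are illustrative
# assumptions, not the stimuli used in the experiment.
import itertools
import random

ACTORS = ["human", "robot"]
WORDS = ["chop", "twist"]      # spoken targets (hypothetical)
GESTURES = ["chop", "twist"]   # iconic gestures (hypothetical)

def build_trials(reps=2, seed=0):
    """Return a shuffled, fully crossed actor x word x gesture trial list.

    A trial is congruent when the (task-irrelevant) gesture matches the
    spoken word; slower or less accurate responses on incongruent trials
    are the signature of automatic gesture processing.
    """
    trials = [
        {"actor": actor, "word": word, "gesture": gesture,
         "congruent": word == gesture}
        for actor, word, gesture in itertools.product(ACTORS, WORDS, GESTURES)
        for _ in range(reps)
    ]
    random.Random(seed).shuffle(trials)
    return trials

trials = build_trials()  # 2 actors x 2 words x 2 gestures x 2 reps = 16 trials
```

Comparing the incongruent-minus-congruent response cost separately for the human and robot actors would then mirror the contrast the experiment reports.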

[1] Jacob Cohen, Statistical Power Analysis for the Behavioral Sciences, 1969.

[2] Hallee Pitterman et al., A Test of the Ability to Identify Emotion in Human Standing and Sitting Postures: The Diagnostic Analysis of Nonverbal Accuracy-2 Posture Test (DANVA2-POS), 2004, Genetic, Social, and General Psychology Monographs.

[3] C. Pelachaud et al., Greta: A Believable Embodied Conversational Agent, 2005.

[4] F. Eyssel et al., (S)he's Got the Look: Gender Stereotyping of Robots, 2012.

[5] Justin W. Hart et al., No fair!! An interaction with a cheating robot, 2010, 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[6] Sukhan Lee et al., Robot gesture and user acceptance of information in human-robot interaction, 2012, 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[7] James E. Katz et al., Unveiling robotophobia and cyber-dystopianism: The role of gender, technology and religion on attitudes towards robots, 2012, 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[8] E. Vincent Cross et al., Explaining robot actions, 2012, 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[9] J. Bavelas et al., Gestures Specialized for Dialogue, 1995.

[10] Hiroshi Ishiguro et al., Generation of nodding, head tilting and eye gazing for human-robot dialogue interaction, 2012, 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[11] Hanafiah Yussof et al., Robot-based Intervention Program for Autistic Children with Humanoid Robot NAO: Initial Response in , 2012.

[12] Peter Robinson et al., Cooperative gestures: Effective signaling for humanoid robots, 2010, 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[13] Cynthia Breazeal et al., Emotion and sociable humanoid robots, 2003, International Journal of Human-Computer Studies.

[15] Bilge Mutlu et al., Designing persuasive robots: How robots might persuade people using vocal and nonverbal cues, 2012, 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[16] Ivana Kruijff-Korbayová et al., Towards Learning Human-Robot Dialogue Policies Combining Speech and Visual Beliefs, 2011, IWSDS.

[17] T. Kanda et al., Measurement of negative attitudes toward robots, 2006.

[18] James Bartolotti et al., Integrating Speech and Iconic Gestures in a Stroop-like Task: Evidence for Automatic Processing, 2010, Journal of Cognitive Neuroscience.

[19] Charles R. Crowell et al., Gendered voice and robot entities: Perceptions and reactions of male and female subjects, 2009.

[20] Xavier Pouteau et al., Multimodal communication in virtual environments, 1995.

[21] A. Kendon, Gesture: Visible Action as Utterance, 2004.

[22] S. Shamsuddin et al., Initial response of autistic children in human-robot interaction therapy with humanoid robot NAO, 2012, IEEE 8th International Colloquium on Signal Processing and its Applications.

[23] S. L. See et al., Applying politeness maxims in social robotics polite dialogue, 2012, HRI.

[24] Zeshu Shao et al., The Role of Synchrony and Ambiguity in Speech–Gesture Integration during Comprehension, 2011, Journal of Cognitive Neuroscience.