Efficiency of speech and iconic gesture integration for robotic and human communicators - a direct comparison

Co-verbal gestures are an important part of human communication, improving the efficiency with which information is conveyed. A key component of this improvement is the observer's ability to integrate information from the two communication channels, speech and gesture. Whether such integration also occurs when multi-modal communication is produced by a humanoid robot, and whether it is as efficient as for a human communicator, is an open question. Here, we present an experiment which, using a fully within-subjects design, shows that for a range of iconic gestures, speech and gesture integration occurs with similar efficiency for human and for robot communicators. The gestures for this study were produced on an Aldebaran Robotics NAO robot platform with a Kinect-based tele-operation system. We also show that our system is able to produce a range of iconic gestures that are understood by participants in unimodal (gesture-only) communication, as well as being efficiently integrated with speech. Hence, we demonstrate the utility of iconic gestures for robotic communicators.
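The abstract does not detail the tele-operation pipeline, but the outline below is a minimal sketch of how Kinect skeleton tracking might be retargeted to NAO arm joints. The `ALMotion` calls come from Aldebaran's NAOqi Python SDK; `read_kinect_joint_angles()`, the robot address, and the joint set are hypothetical placeholders for illustration, not the authors' implementation.

```python
# Minimal Kinect-to-NAO tele-operation sketch (illustrative, not the
# authors' system). Assumes the NAOqi Python SDK is installed;
# read_kinect_joint_angles() is a hypothetical stand-in for whatever
# Kinect skeleton-tracking binding the tele-operator uses.
import time

from naoqi import ALProxy

NAO_IP = "192.168.1.10"   # assumed robot address
NAO_PORT = 9559           # default NAOqi port

# Arm joints retargeted from the operator's tracked skeleton.
ARM_JOINTS = [
    "LShoulderPitch", "LShoulderRoll", "LElbowYaw", "LElbowRoll",
    "RShoulderPitch", "RShoulderRoll", "RElbowYaw", "RElbowRoll",
]

def read_kinect_joint_angles():
    """Hypothetical: map Kinect shoulder/elbow skeleton vectors to a
    dict of NAO joint name -> target angle in radians. A fixed neutral
    pose is returned here so the sketch runs end to end."""
    return {name: 0.0 for name in ARM_JOINTS}

motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
motion.setStiffnesses("Body", 1.0)  # enable actuators before streaming

while True:
    targets = read_kinect_joint_angles()
    names = [j for j in ARM_JOINTS if j in targets]
    # setAngles is non-blocking; 0.2 caps joint speed for smooth motion.
    motion.setAngles(names, [targets[j] for j in names], 0.2)
    time.sleep(1.0 / 30)  # roughly the Kinect's 30 Hz skeleton rate
```

Streaming joint targets this way, rather than triggering pre-scripted animations, is what lets a tele-operated robot reproduce the operator's own spontaneous iconic gestures.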
