Iconic Gestures for Robot Avatars: Recognition and Integration with Speech

Co-verbal gestures are an important part of human communication, improving the efficiency and efficacy of information conveyance. One possible means by which such multi-modal communication might be realized remotely is a tele-operated humanoid robot avatar. Such avatars have previously been shown to enhance social presence and operator salience. We present a motion-tracking-based tele-operation system for the NAO robot platform that allows direct transmission of the speech and gestures produced by the operator. To assess the capabilities of this system for transmitting multi-modal communication, we conducted a user study investigating whether robot-produced iconic gestures are comprehensible and whether they are integrated with speech. Outcomes for robot-performed gestures were compared directly with those for gestures produced by a human actor, using a within-participants experimental design. We show that iconic gestures produced by a tele-operated robot, when presented alone, are understood by participants almost as well as those produced by a human. More importantly, we show that when presented as part of a multi-modal communication, gestures are integrated with speech equally well for human and robot performances.
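A core step in any motion-tracking tele-operation pipeline of the kind described above is retargeting: mapping tracked operator joint angles onto the robot's joints while respecting the robot's mechanical limits. The following is a minimal, hypothetical sketch of that step; the joint names and limit values are illustrative assumptions, not NAO's actual specification or the authors' implementation.

```python
# Illustrative joint-angle limits (radians) for an upper-body humanoid arm.
# These names and ranges are assumptions for the sketch, not NAO's real spec.
JOINT_LIMITS = {
    "shoulder_pitch": (-2.0, 2.0),
    "shoulder_roll": (-0.3, 1.3),
    "elbow_roll": (-1.5, 0.0),
}


def clamp(value, low, high):
    """Restrict a commanded angle to the joint's mechanical range."""
    return max(low, min(high, value))


def retarget(tracked_angles):
    """Map tracked operator angles (radians) to safe robot joint commands.

    Joints not present in JOINT_LIMITS are dropped; all others are
    clamped so the robot is never commanded outside its limits.
    """
    return {
        joint: clamp(angle, *JOINT_LIMITS[joint])
        for joint, angle in tracked_angles.items()
        if joint in JOINT_LIMITS
    }


# Example: an operator's shoulder swings past the robot's range.
commands = retarget({"shoulder_pitch": 2.5, "elbow_roll": -0.7})
print(commands)  # shoulder_pitch is clamped to 2.0; elbow_roll passes through
```

In a full system these commands would be streamed to the robot's motion interface at the tracker's frame rate; the clamping step is what keeps an unconstrained human motion safe on hardware with narrower joint ranges.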

[1]  P. Ekman Movements with Precise Meanings , 1976 .

[2]  J. Fleiss,et al.  Intraclass correlations: uses in assessing rater reliability. , 1979, Psychological bulletin.

[3]  M. Studdert-Kennedy Hand and Mind: What Gestures Reveal About Thought. , 1994 .

[4]  D. McNeill,et al.  Speech-gesture mismatches: Evidence for one underlying representation of linguistic and nonlinguistic information , 1998 .

[5]  T. Trabasso,et al.  Offering a Hand to Pragmatic Understanding: The Role of Speech and Gesture in Comprehension and Memory , 1999 .

[6]  Tetsuo Ono,et al.  Embodied communications between humans and robots emerging from entrained gestures , 2003, Proceedings 2003 IEEE International Symposium on Computational Intelligence in Robotics and Automation. Computational Intelligence in Robotics and Automation for the New Millennium (Cat. No.03EX694).

[7]  Y. Paulignan,et al.  An Interference Effect of Observed Biological Movement on Action , 2003, Current Biology.

[8]  A. Kendon Gesture: Visible Action as Utterance , 2004 .

[9]  Heather Shovelton,et al.  Why the spontaneous images created by the hands during talk can help make TV advertisements more effective. , 2005, British journal of psychology.

[10]  Henrik Schiøler,et al.  Sociable Robots Through Self-Maintained Energy , 2006 .

[11]  Giorgio Metta,et al.  YARP: Yet Another Robot Platform , 2006 .

[12]  Jonathan W. Peirce,et al.  PsychoPy—Psychophysics software in Python , 2007, Journal of Neuroscience Methods.

[13]  Christian Keysers,et al.  The anthropomorphic brain: The mirror neuron system responds to human and robotic actions , 2007, NeuroImage.

[14]  Susan R. Fussell,et al.  Comparing a computer agent with a humanoid robot , 2007, 2007 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[15]  Samuel Tardieu,et al.  The Urbi Universal Platform for Robotics , 2008 .

[16]  Pierre Blazevic,et al.  Mechatronic design of NAO humanoid , 2009, 2009 IEEE International Conference on Robotics and Automation.

[17]  Morgan Quigley,et al.  ROS: an open-source Robot Operating System , 2009, ICRA 2009.

[18]  Peter Robinson,et al.  Cooperative gestures: Effective signaling for humanoid robots , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[19]  James Bartolotti,et al.  Integrating Speech and Iconic Gestures in a Stroop-like Task: Evidence for Automatic Processing , 2010, Journal of Cognitive Neuroscience.

[20]  Cynthia Breazeal,et al.  MeBot: A robotic platform for socially embodied telepresence , 2010, 2010 5th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[21]  Nicole C. Krämer,et al.  "It doesn't matter what you are!" Explaining social effects of agents and avatars , 2010, Comput. Hum. Behav..

[22]  Cynthia Breazeal,et al.  MeBot: a robotic platform for socially embodied presence , 2010, HRI.

[23]  Gary Morgan,et al.  Iconic gesture and speech integration in younger and older adults , 2011 .

[24]  G. Beattie,et al.  An exploration of the other side of semantic communication: How the spontaneous movements of the human hand add crucial meaning to narrative , 2011 .

[25]  Sriram Subramanian,et al.  The effects of robot-performed co-verbal gesture on listener behaviour , 2011, 2011 11th IEEE-RAS International Conference on Humanoid Robots.

[26]  Autumn B. Hostetter,et al.  When do gestures communicate? A meta-analysis. , 2011, Psychological bulletin.

[27]  Nick Campbell,et al.  Investigating the use of Non-verbal Cues in Human-Robot Interaction with a Nao robot , 2012, 2012 IEEE 3rd International Conference on Cognitive Infocommunications (CogInfoCom).

[28]  Max Q.-H. Meng,et al.  Designing gestures with semantic meanings for humanoid robot , 2012, 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO).

[29]  John-John Cabibihan,et al.  Human-Recognizable Robotic Gestures , 2012, IEEE Transactions on Autonomous Mental Development.

[30]  Zhengchen Zhang,et al.  Telerobotic Pointing Gestures Shape Human Spatial Cognition , 2012, Int. J. Soc. Robotics.

[31]  Bilge Mutlu,et al.  Designing persuasive robots: How robots might persuade people using vocal and nonverbal cues , 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[32]  Cory J. Hayes,et al.  Automatic processing of irrelevant co-speech gestures with human but not robot actors , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[33]  Anthony G. Pipe,et al.  Cooperative tabletop working for humans and humanoid robots: Group interaction with an avatar , 2013, 2013 IEEE International Conference on Robotics and Automation.

[34]  Elena Torta,et al.  Effects of Eye Contact and Iconic Gestures on Message Retention in Human-Robot Interaction , 2013, Int. J. Soc. Robotics.

[35]  Guillaume Gibert,et al.  What makes human so different ? Analysis of human-humanoid robot interaction with a super Wizard of Oz platform , 2013 .

[36]  Adriana Tapus,et al.  A model for synthesizing a combined verbal and nonverbal behavior based on personality traits in human-robot interaction , 2013, 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[37]  Lin Wang,et al.  The role of beat gesture and pitch accent in semantic processing: An ERP study , 2013, Neuropsychologia.

[38]  Stefan Kopp,et al.  To Err is Human(-like): Effects of Robot Gesture on Perceived Anthropomorphism and Likability , 2013, International Journal of Social Robotics.

[39]  Allison Sauppé,et al.  Robot Deictics: How Gesture and Context Shape Referential Communication , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[40]  Bilge Mutlu,et al.  Learning-Based Modeling of Multimodal Behaviors for Humanlike Robots , 2014, 2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[41]  Ana-Maria Cebolla,et al.  Physiological modules for generating discrete and rhythmic movements: action identification by a dynamic recurrent neural network , 2014, Front. Comput. Neurosci..

[42]  Ana-Maria Cebolla,et al.  Physiological modules for generating discrete and rhythmic movements: component analysis of EMG signals , 2015, Front. Comput. Neurosci..

[43]  Paul Bremner,et al.  Speech and Gesture Emphasis Effects For Robotic and Human Communicators - a Direct Comparison , 2015, 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI).

[44]  Michael Andric,et al.  The neural basis of hand gesture comprehension: A meta-analysis of functional magnetic resonance imaging studies , 2015, Neuroscience & Biobehavioral Reviews.

[45]  Paul Bremner,et al.  Efficiency of speech and iconic gesture integration for robotic and human communicators - a direct comparison , 2015, 2015 IEEE International Conference on Robotics and Automation (ICRA).

[46]  Kazuaki Tanaka,et al.  Physical Embodiment Can Produce Robot Operator’s Pseudo Presence , 2015, Front. ICT.