A friendly gesture: Investigating the effect of multimodal robot behavior in human-robot interaction

Gesture is an important feature of social interaction, frequently used by human speakers to illustrate what speech alone cannot convey, e.g., referential, spatial, or iconic information. Accordingly, humanoid robots intended to engage in natural human-robot interaction should produce speech-accompanying gestures for comprehensible and believable behavior. But how does a robot's non-verbal behavior influence human evaluation of communication quality and of the robot itself? To address this research question, we conducted two experimental studies. Using the Honda humanoid robot, we investigated how humans perceive various gestural patterns performed by the robot while interacting with it in a situational context. Our findings suggest that the robot is evaluated more positively when non-verbal behaviors such as hand and arm gestures are displayed along with speech. This effect was enhanced when participants were explicitly asked to direct their attention towards the robot during the interaction.
