The effects of robot-performed co-verbal gesture on listener behaviour

Co-verbal gestures, the spontaneous gestures that accompany human speech, form an integral part of human communication; they have been shown to have a variety of beneficial effects on listener behaviour. We therefore suggest that a humanoid robot which aims to communicate effectively with human users should gesture in a human-like way, and thus engender similar beneficial effects in users. To investigate whether robot-performed co-verbal gestures do produce these effects, and are thus worthwhile for a communicative robot, we conducted two user studies. In the first study we investigated whether users paid attention to our humanoid robot for longer when it performed co-verbal gestures than when it performed small arm movements unrelated to the speech. Our findings confirmed our expectations: there was a highly significant difference between the two conditions in the length of time that users paid attention. In the second study we investigated whether gestures performed during speech improved user memory of the facts they accompanied, and whether the gestures were linked in memory to that speech. We found an observable effect on the speed and certainty of recall. We consider these observations of normative responses to the gestures performed to be an indication of the value of co-verbal gesture for a communicative humanoid robot, and an objective measure of the success of our gesturing method.