Mutual entrainment: Implicit elicitation of human gestures by robot speech

Social robots that provide services to humans in real environments have been developed in recent years. To be user-friendly, such a robot should recognize its users' requests through human-like communication. However, users' communication styles are too diverse for this to be achieved reliably. If the robot could shape those styles, its recognition ability would improve. Entrainment, a phenomenon in which a human's behavior becomes synchronized with a robot's behavior, can be useful for this shaping. Previous studies have reported entrainment within a single modality, but they have given little attention to entrainment across different modalities (e.g., speech and gestures). This cross-modal effect needs to be considered because human-robot interaction is inherently multi-modal. In this paper, we define "mutual entrainment" as entrainment across different modalities and investigate its effect through a laboratory experiment. We evaluated how the frequency of human pointing gestures varies with the amount of information in robot speech, and found that gesture frequency increased as the amount of information decreased. The results suggest that smoother human-robot communication can be achieved by shaping human behavior through mutual entrainment.
