Lexical entrainment in human-robot interaction: Can robots entrain human vocabulary?

To support people in daily life, a communication robot must be able to recognize the object a person refers to. However, because of the breadth of human vocabulary, people often refer to objects in terms the robot cannot understand. This paper focuses on lexical entrainment, the phenomenon whereby people tend to adopt the terms of their interlocutor, as a way to address this problem. While lexical entrainment has been well studied in human-computer interaction, few published papers have examined it in human-robot interaction. To investigate how lexical entrainment occurs in human-robot interaction, we conducted experiments in which people instructed a robot to move objects. Our results show that two types of lexical entrainment occur in human-robot interaction. We also discuss how the state of objects affects lexical entrainment. Finally, we developed a test-bed system that recognizes a referred-to object on the basis of the knowledge gained from our experiments.
