Lexical Entrainment in Human-Robot Interaction

This paper shows that lexical entrainment, the tendency of a person to change her verbal expressions to match those used by her addressee, occurs in interactions between people and a robot when they refer to an object in a shared physical space. Many studies argue that lexical entrainment is crucial both for understanding the principles of human dialogue and for developing natural language interfaces for artificial media. However, few studies have examined it in human-robot interaction where humans and a robot share a physical space. If lexical entrainment occurs when a physical space is shared with a robot, such findings will contribute to the development of natural language interfaces for social robots. We designed experiments in which participants referred to an object and the robot confirmed the reference, and we measured the extent to which the participants repeated the verbal expressions used by the robot. Our participants tended to adopt both the same verbal expressions and the same lexical categories as the robot.
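The paper does not state the exact metric here, but one plausible way to quantify "the extent to which participants repeated the robot's expressions" is an entrainment rate: the proportion of trials on which the participant's referring expression matches the term the robot used for the same object. The sketch below is a minimal, hypothetical Python illustration under that assumption; the function name, data format, and exact-string matching rule are not taken from the paper.

```python
# Minimal sketch (illustrative assumption, not the authors' method):
# entrainment rate = fraction of trials on which the participant reused
# the referring expression the robot used for the same object.

def entrainment_rate(robot_terms, participant_terms):
    """Proportion of trials where the participant reused the robot's term.

    robot_terms / participant_terms: lists of referring expressions,
    one per trial, aligned by trial index.
    """
    if not robot_terms:
        return 0.0
    matches = sum(
        1 for r, p in zip(robot_terms, participant_terms)
        if r.strip().lower() == p.strip().lower()
    )
    return matches / len(robot_terms)


if __name__ == "__main__":
    # Hypothetical trials: the robot says "mug", the participant later
    # refers to the same object as "mug" (match) or "cup" (no match).
    robot = ["mug", "couch", "sneakers"]
    participant = ["mug", "sofa", "sneakers"]
    print(entrainment_rate(robot, participant))  # -> 0.666...
```

A stricter analysis might also score matches at the level of lexical category rather than exact wording, which is closer to the second result reported in the abstract.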
