Towards a Framework for Integrated Natural Language Processing Architectures for Social Robots

Current social robots lack the natural language capabilities needed to interact with humans in natural ways. In this paper, we present results from human-subject experiments designed to isolate spoken interaction types in a search-and-rescue task, and we briefly discuss the implications for natural language processing (NLP) architectures for embodied, situated agents.
