Using a natural language and gesture interface for unmanned vehicles

Unmanned vehicles, such as mobile robots, must exhibit adjustable autonomy: they must be self-sufficient when the situation warrants, yet as they interact with each other and with humans, they must be able to dynamically adjust their independence or dependence as co-operative agents pursuing a shared goal. We have been investigating various modes of communication that enhance a robot's ability to work interactively with other robots and with humans. Specifically, we have been investigating how natural language and gesture can provide a user-friendly interface to mobile robots. We have extended this initial work with semantic and pragmatic procedures that allow humans and robots to act co-operatively, based on whether or not the various agents in the interaction have achieved their goals. As commands are issued, either spoken or initiated by clicking buttons on a Personal Digital Assistant, and gestures are made, either naturally or symbolically, we track each goal of the interaction, the agent responsible for it, and whether or not it has been achieved. Each agent is aware of its own and the others' goals and of which goals have been stated or accomplished, so that eventually any member of the group, robot or human, can if necessary interact with the other members to achieve the stated goals of a mission.
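
The abstract does not give an implementation, but the goal-tracking bookkeeping it describes, where each goal is paired with the agent responsible for it and an achieved/not-achieved status visible to every member of the group, could look roughly like the following minimal Python sketch. All names here (Goal, GoalTracker, Status, and the example command strings) are illustrative assumptions, not details from the original system.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    STATED = auto()       # the goal has been voiced or clicked, e.g. via PDA
    IN_PROGRESS = auto()  # some agent is working on it
    ACHIEVED = auto()     # the goal has been accomplished

@dataclass
class Goal:
    description: str      # hypothetical example: "go over there"
    agent: str            # the robot or human responsible for this goal
    status: Status = Status.STATED

@dataclass
class GoalTracker:
    """Shared record of stated goals, their agents, and their status."""
    goals: list[Goal] = field(default_factory=list)

    def state(self, description: str, agent: str) -> Goal:
        # Record a newly stated goal and the agent it is assigned to.
        goal = Goal(description, agent)
        self.goals.append(goal)
        return goal

    def mark_achieved(self, goal: Goal) -> None:
        goal.status = Status.ACHIEVED

    def pending(self, agent: str | None = None) -> list[Goal]:
        # Goals not yet achieved, optionally filtered by agent, so any
        # member of the group can see what remains to be done.
        return [g for g in self.goals
                if g.status is not Status.ACHIEVED
                and (agent is None or g.agent == agent)]

# Hypothetical usage: a spoken command creates a goal; the robot later
# reports completion, and the shared record reflects it.
tracker = GoalTracker()
goal = tracker.state("go over there", agent="robot-1")
tracker.mark_achieved(goal)
print(tracker.pending())  # -> [] once all stated goals are achieved
```

Because every agent consults the same record of who owns which goal and what has already been accomplished, any member of the group can, in principle, pick up an unachieved goal or prompt another member about it, which is the co-operative behaviour the abstract describes.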
