Interactive Multi-Modal Robot Programming

This paper introduces a novel approach to programming a robot interactively through a multi-modal interface. The key characteristic of this approach is that the user can provide feedback interactively at any time, during both the programming and the execution phases. The framework takes a three-step approach to the problem: multi-modal recognition, intention interpretation, and prioritized task execution. The multi-modal recognition module translates hand gestures and spontaneous speech into a structured symbolic data stream without abstracting away the user's intent. The intention interpretation module selects the appropriate primitives to generate a task based on the user's input, the system's current state, and robot sensor data. Finally, the prioritized task execution module selects and executes skill primitives based on the system's current state, sensor inputs, and prior tasks. The framework is demonstrated by interactively controlling and programming a vacuum-cleaning robot.
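The three-stage pipeline described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: all class names, the symbolic token format, and the primitive table are hypothetical, and the priority rule (user feedback such as "stop" preempting queued tasks) is one plausible reading of "prioritized task execution".

```python
from dataclasses import dataclass
import heapq

@dataclass
class SymbolicEvent:
    modality: str   # "speech" or "gesture" (illustrative)
    symbol: str     # structured token, e.g. "clean" or "go:kitchen"

class MultiModalRecognizer:
    """Stage 1: translate raw speech/gesture input into symbolic events."""
    def recognize(self, modality: str, raw: str) -> SymbolicEvent:
        # A real system would run speech and gesture recognizers here;
        # this sketch passes the token through, preserving user intent.
        return SymbolicEvent(modality, raw)

class IntentionInterpreter:
    """Stage 2: map symbolic events onto task primitives, given state."""
    PRIMITIVES = {"clean": "vacuum_area", "stop": "halt", "go": "move_to"}

    def interpret(self, event: SymbolicEvent, state: dict) -> tuple:
        primitive = self.PRIMITIVES.get(event.symbol.split(":")[0], "ignore")
        # Assumed rule: interactive feedback like "stop" gets top priority.
        priority = 0 if primitive == "halt" else 1
        return priority, primitive

class PrioritizedTaskExecutor:
    """Stage 3: execute skill primitives in priority order."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker keeps equal-priority tasks in order

    def submit(self, priority: int, primitive: str) -> None:
        heapq.heappush(self._queue, (priority, self._seq, primitive))
        self._seq += 1

    def run(self) -> list:
        executed = []
        while self._queue:
            _, _, primitive = heapq.heappop(self._queue)
            if primitive != "ignore":
                executed.append(primitive)
        return executed

# Usage: the user issues commands while earlier tasks are still queued.
recognizer = MultiModalRecognizer()
interpreter = IntentionInterpreter()
executor = PrioritizedTaskExecutor()
commands = [("speech", "clean"), ("gesture", "go:kitchen"), ("speech", "stop")]
for modality, raw in commands:
    event = recognizer.recognize(modality, raw)
    executor.submit(*interpreter.interpret(event, state={}))
order = executor.run()
print(order)  # "halt" runs first: the interactive "stop" preempts queued work
```

The priority queue is what lets feedback arrive "at any time": a high-priority event submitted mid-execution is dequeued before the remaining lower-priority tasks.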
