Interaction with Robot Assistants: Commanding

Instructing a mobile robot assistant still requires classical user interfaces. Verbal or gesture commands offer a more intuitive way of commanding. In this article, we present new approaches to, and enhancements of, established methods in use in our laboratory. Our aim is to interact with a robot through natural and direct communication techniques and thereby cope with complex robot programs. A skin color segmentation algorithm tracks the user's hand, and Fourier descriptors of its outline are computed in order to map hand shapes to gesture classes. Recognized gesture types and manipulable objects, as well as verbal commands, are fed into an event manager that selects and starts an appropriate reaction. This architecture has proven its applicability to human-machine interaction in first experiments and will be extended in the near future.
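The gesture recognition step can be pictured as a short pipeline: segment skin-colored pixels, take the largest outer contour as the hand, and compute translation-, scale- and rotation-invariant Fourier descriptors from that outline. The following is a minimal sketch of that idea, not our actual implementation; the HSV thresholds, the descriptor count k, and the OpenCV-based segmentation are illustrative assumptions.

```python
# Sketch: skin-color segmentation of the hand, then Fourier descriptors
# of its outline. Thresholds and k are assumed values for illustration.
import cv2
import numpy as np

# Assumed HSV skin-color range; must be tuned to camera and lighting.
SKIN_LOWER = np.array([0, 40, 60], dtype=np.uint8)
SKIN_UPPER = np.array([25, 180, 255], dtype=np.uint8)

def hand_contour(frame_bgr):
    """Segment skin-colored pixels and return the largest outer contour."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOWER, SKIN_UPPER)
    # Remove small speckles before contour extraction.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)  # OpenCV 4 signature
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)

def fourier_descriptors(contour, k=16):
    """Map the hand outline to k FFT magnitudes, invariant to translation,
    scale, rotation and the starting point of the contour."""
    pts = contour.reshape(-1, 2).astype(np.float64)
    z = pts[:, 0] + 1j * pts[:, 1]   # contour points as a complex signal
    spectrum = np.fft.fft(z)
    spectrum[0] = 0                  # drop DC term -> translation invariance
    mags = np.abs(spectrum)          # magnitudes -> rotation invariance
    mags /= mags[1]                  # normalize -> scale invariance
    return mags[1:k + 1]             # low-frequency shape signature
```

In use, the resulting descriptor vector would be compared against stored prototypes of each gesture class, for instance with a nearest-neighbor rule; the choice of classifier here is again an assumption, not a detail taken from the text above.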

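The event manager, in turn, can be pictured as a dispatch table keyed by input modality and recognized label. The sketch below is an assumed structure for such a component, not the implementation described above; all class, modality, and label names are illustrative.

```python
# Sketch: an event manager that receives gesture, object and speech events
# and starts the matching robot reaction. All names are illustrative.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class Event:
    modality: str   # "gesture", "speech" or "object"
    label: str      # e.g. "point", "grasp", "cup"

class EventManager:
    def __init__(self) -> None:
        self._handlers: Dict[Tuple[str, str], Callable[[Event], None]] = {}

    def register(self, modality: str, label: str,
                 handler: Callable[[Event], None]) -> None:
        """Associate a (modality, label) pair with a reaction."""
        self._handlers[(modality, label)] = handler

    def dispatch(self, event: Event) -> None:
        """Look up and start the appropriate reaction, if one is registered."""
        handler = self._handlers.get((event.modality, event.label))
        if handler is not None:
            handler(event)

# Example wiring: a pointing gesture triggers an (assumed) approach behavior.
manager = EventManager()
manager.register("gesture", "point", lambda e: print("approach pointed object"))
manager.dispatch(Event("gesture", "point"))
```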