A neural-network based approach for recognition of pose and motion gestures on a mobile robot

Recent advances in robotic hardware and software suggest that service robots will soon become feasible, so finding "natural" ways for humans and robots to communicate is of fundamental importance to the field. This paper describes a gesture-based interface for human-robot interaction that enables people to instruct robots through easy-to-perform arm gestures. Such gestures may be static pose gestures, which involve only a specific configuration of the person's arm, or dynamic motion gestures, which involve movement (such as waving). Gestures are recognized in real time, at close to frame rate, using neural networks. A fast, color-based tracking algorithm enables the robot to track and follow a person reliably through office environments with drastically changing lighting conditions. Results are reported in the context of an interactive clean-up task, in which a person guides the robot to specific locations that need to be cleaned and the robot picks up trash, which it then delivers to the nearest trash bin.
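The abstract's color-based tracking under changing lighting can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it only shows one common idea such trackers rely on: matching pixels in normalized-RGB (chromaticity) space, which discards overall brightness, and then taking the centroid of the matching blob. All function and parameter names here are illustrative.

```python
import numpy as np

def track_color_blob(image, ref_chroma, tol=0.06):
    """Locate the centroid of pixels whose chromaticity matches a reference color.

    image      : (H, W, 3) float array of RGB values.
    ref_chroma : length-2 array (r, g) with r = R/(R+G+B), g = G/(R+G+B);
                 normalizing out intensity gives some robustness to lighting.
    Returns (row, col) of the matching blob's centroid, or None if no match.
    """
    s = image.sum(axis=2, keepdims=True) + 1e-8   # per-pixel intensity (avoid /0)
    chroma = image[..., :2] / s                    # (r, g) chromaticity per pixel
    mask = np.linalg.norm(chroma - ref_chroma, axis=2) < tol
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

Running this per frame and steering toward the returned centroid is the basic loop a person-following robot would use; the paper's system additionally recognizes pose and motion gestures on top of the tracked region.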