The Essential Components of Human-Friendly Robot Systems

To develop human-friendly robots, two key components are required: visual interfaces and safe mechanisms. Visual interfaces enable natural, easy human-robot interaction; facial gestures, for example, offer an intuitive way to control a robot. In this paper, we report on a vision-based interface that tracks a user's facial features and gaze point in real time. Human-friendly robots must also have high-integrity safety systems that ensure people are never harmed. To guarantee human safety, we require manipulator mechanisms in which all actuators are force controlled in a manner that prevents dangerous impacts with people and the environment. In this paper we report on a control scheme for the Barrett-MIT whole arm manipulator (WAM) that allows people to interact safely with the robot.
