A Hand-Eye-Arm Coordinated System

In this paper we present a description of, and experiments with, a tightly coupled hand-eye-arm manipulatory system. We explain the philosophy and motivation for building a tightly coupled system that in fact consists of largely autonomous modules communicating with one another through a central coordinator. We describe each of the modules in the system and their interactions. We highlight the need for sensor-driven manipulation and explain how this system, in which the hand is equipped with multiple tactile sensors, is capable not only of manipulating unknown objects but also of detecting collisions and complying with them. We explain how the control of the system is partitioned into several closed loops, representing coordination at the level of gross manipulator motions as well as fine motions. We describe the modes in which the system can operate, as well as some of the experiments currently being performed with it.

University of Pennsylvania, Department of Computer and Information Science Technical Report No. MS-CIS-91-05 (GRASP LAB 250). Available at ScholarlyCommons: http://repository.upenn.edu/cis_reports/400

Sanjay Agrawal, Ruzena Bajcsy, Vijay Kumar
Department of Computer and Information Science
School of Engineering and Applied Science
University of Pennsylvania, Philadelphia, PA 19104-6389
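To make the architectural idea concrete, the following is a minimal sketch, in Python, of autonomous hand, eye, and arm modules that exchange messages only through a central coordinator, with a gross-motion loop driven by vision and a fine-motion loop triggered by tactile contact. This is an illustrative assumption of how such a coordinator could be organized, not the authors' implementation; all class names, message topics, and thresholds are hypothetical.

```python
# Sketch only: message-passing coordination among Hand, Eye, and Arm modules.
# Module names, message topics, and the tactile threshold are illustrative assumptions.

from dataclasses import dataclass, field
from queue import Queue


@dataclass
class Message:
    sender: str                      # module that produced the message
    topic: str                       # e.g. "object_pose", "contact", "arm_done"
    payload: dict = field(default_factory=dict)


class Module:
    """Each module runs autonomously and talks only to the coordinator."""
    def __init__(self, name, coordinator):
        self.name = name
        self.coordinator = coordinator
        coordinator.register(self)

    def send(self, topic, **payload):
        self.coordinator.post(Message(self.name, topic, payload))

    def handle(self, msg):
        pass  # overridden by concrete modules


class Eye(Module):
    def observe(self):
        # A real system would run vision here; we just report a fixed object pose.
        self.send("object_pose", pose=(0.40, 0.10, 0.05))


class Arm(Module):
    def handle(self, msg):
        if msg.topic == "move_to":
            print(f"[arm]  gross motion toward {msg.payload['pose']}")
            self.send("arm_done")


class Hand(Module):
    CONTACT_THRESHOLD = 0.5  # assumed tactile threshold (arbitrary units)

    def handle(self, msg):
        if msg.topic == "close_grasp":
            tactile_reading = 0.7  # stand-in for a real tactile sensor value
            if tactile_reading > self.CONTACT_THRESHOLD:
                print("[hand] contact detected, switching to fine-motion compliance")
                self.send("contact", force=tactile_reading)


class Coordinator:
    """Central coordinator: the only component every module communicates with."""
    def __init__(self):
        self.modules = {}
        self.queue = Queue()

    def register(self, module):
        self.modules[module.name] = module

    def post(self, msg):
        self.queue.put(msg)

    def run(self):
        # Routing policy: vision events drive the gross-motion loop,
        # tactile contact events drive the fine-motion loop.
        while not self.queue.empty():
            msg = self.queue.get()
            if msg.topic == "object_pose":
                self.modules["arm"].handle(Message("coordinator", "move_to", msg.payload))
            elif msg.topic == "arm_done":
                self.modules["hand"].handle(Message("coordinator", "close_grasp"))
            elif msg.topic == "contact":
                print(f"[coord] complying with contact force {msg.payload['force']}")


if __name__ == "__main__":
    coord = Coordinator()
    eye, arm, hand = Eye("eye", coord), Arm("arm", coord), Hand("hand", coord)
    eye.observe()   # a vision event enters the system
    coord.run()     # coordinator routes the gross- and fine-motion loops
```

The point of the sketch is the coupling pattern: modules never call each other directly, so each remains autonomous, while the coordinator decides when control passes from the vision-driven gross-motion loop to the tactile-driven fine-motion loop.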
