Self-organization of reaching operation in a robot: a bio-mimetic approach

Recently a new approach, called behavior-based robotics, has been proposed to overcome the difficulties of the traditional AI approach to robotics. The approach has two goals: an engineering goal of building flexible autonomous robots, and a scientific goal of understanding human cognition. As one such behavior-based approach, this paper deals with the development of a reaching operation in a simulated robot. Based on the importance of the contraction of the muscles controlling body parts such as the binocular eyeballs, the neck, and the arm, reaching in the robot is modeled and learned in a learn-by-doing manner. The learning result is stored in memory as experience of muscle contraction, which serves as a critical cue both for space perception by the vision system and for arm reaching.
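A minimal sketch of such a learn-by-doing scheme might look as follows. All names, the toy two-link arm, the gaze model, and the nearest-neighbor recall are illustrative assumptions, not the paper's implementation; the point is only that babbled (gaze, arm) muscle-contraction pairs stored as experience can later drive reaching.

```python
import math
import random

def hand_position(shoulder, elbow, l1=0.3, l2=0.25):
    # Toy 2-link planar arm forward kinematics (illustrative only).
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

def gaze_posture(x, y, baseline=0.1):
    # Illustrative "eye muscle contraction" needed to fixate (x, y):
    # a pan angle toward the point and a binocular vergence angle.
    pan = math.atan2(x, y)
    dist = math.hypot(x, y)
    vergence = 2 * math.atan2(baseline / 2, dist)
    return pan, vergence

# Learn by doing: babble random arm postures, fixate the hand with the
# eyes, and store each (gaze, arm) pair as experience of contraction.
experience = []
rng = random.Random(0)
for _ in range(2000):
    arm = (rng.uniform(0, math.pi), rng.uniform(0, math.pi / 2))
    gaze = gaze_posture(*hand_position(*arm))
    experience.append((gaze, arm))

def reach(target_gaze):
    # Recall the arm posture whose remembered gaze is closest to the
    # gaze currently fixating the target (nearest-neighbor lookup).
    return min(experience, key=lambda e: math.dist(e[0], target_gaze))[1]

# Reaching toward a point the eyes fixate: the recalled arm posture
# brings the hand near the fixated location.
goal = hand_position(0.8, 0.5)
arm = reach(gaze_posture(*goal))
err = math.dist(hand_position(*arm), goal)
```

Because gaze posture (pan plus vergence) determines a unique point in space, two nearby gaze memories imply nearby hand positions, so the recall stays accurate even when the arm has redundant postures for one location.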
