Integration of brain-like computational structure and infant behavioral pattern for robotic hand-eye coordination

Robotic hand-eye coordination plays an important role in dealing with real-time environments, and the procedure by which this skill is learned shapes the fundamental framework of robotic cognition. This paper introduces a novel developmental approach to hand-eye coordination in an autonomous robotic system. Existing work typically employs neural network models to map visual perception directly to hand movements. In the proposed approach, a computational structure and a cross-modal link mechanism are applied to simulate brain cortices, and a movement pattern inspired by infant behavior is designed to help the robot learn to build its hand-eye coordination. The work is supported by experimental evaluation, which shows that the learning algorithm provides fast and incremental acquisition of behavioral competence.
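
The paper's cortical model and cross-modal link mechanism are not reproduced here. Purely as an illustration of the general idea of incrementally learning a visuomotor mapping through exploratory, infant-like movements, the sketch below (an assumption, not the authors' architecture) pairs randomly babbled joint angles of a simulated planar arm with the hand positions they produce and stores them in a winner-take-all visuomotor codebook, i.e. a simplified self-organizing map without neighborhood updates. Reaching then recalls the motor values of the visually closest unit.

```python
import numpy as np

# Planar 2-link arm: forward kinematics give the hand position the "eye" observes.
L1, L2 = 0.3, 0.25  # illustrative link lengths (m)

def forward_kinematics(q):
    """Hand (x, y) position for joint angles q = (q1, q2)."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

class HandEyeCodebook:
    """Winner-take-all visuomotor codebook (simplified SOM, no neighborhood).

    Each unit stores a (visual, motor) pair; the winner is chosen in visual
    space and both entries move toward the observed pair, so the map can later
    be read 'backwards' from a visual target to a motor command.
    """
    def __init__(self, n_units=100, seed=0):
        rng = np.random.default_rng(seed)
        self.visual = rng.uniform(-0.6, 0.6, size=(n_units, 2))
        self.motor = rng.uniform(-np.pi, np.pi, size=(n_units, 2))

    def update(self, v, q, lr=0.2):
        """One incremental learning step from an observed (visual, motor) pair."""
        winner = np.argmin(np.linalg.norm(self.visual - v, axis=1))
        self.visual[winner] += lr * (v - self.visual[winner])
        self.motor[winner] += lr * (q - self.motor[winner])

    def reach(self, target):
        """Joint angles stored by the unit whose visual entry is closest to the target."""
        winner = np.argmin(np.linalg.norm(self.visual - target, axis=1))
        return self.motor[winner]

# "Motor babbling": random exploratory postures paired with the seen hand position.
# The elbow is kept in [0, pi] so each visual position maps to one arm posture.
rng = np.random.default_rng(1)
codebook = HandEyeCodebook()
for _ in range(5000):
    q = rng.uniform([-np.pi, 0.0], [np.pi, np.pi])
    codebook.update(forward_kinematics(q), q)

# Reaching test: command the recalled angles for a visual target and check the error.
target = np.array([0.35, 0.20])
q_cmd = codebook.reach(target)
print("reach error:", np.linalg.norm(forward_kinematics(q_cmd) - target))
```

Because each babbled movement updates only the nearest unit, the mapping improves incrementally and never requires a batch retraining pass, which is the flavor of fast, incremental developmental learning the abstract describes.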
