Adaptive visual pursuit involving eye-head coordination and prediction of the target motion

Increasingly complex robots are being designed, and as their complexity grows, traditional methods for robotic control break down: deriving the appropriate kinematic functions can easily become intractable. For this reason, the use of neuro-controllers, i.e. controllers based on machine learning methods, has risen rapidly. Such controllers are especially useful in humanoid robotics, where the robot must routinely perform difficult tasks in a complex environment. A basic task for a humanoid robot is to visually pursue a target using eye-head coordination. In this work we present an adaptive model based on a neuro-controller for visual pursuit. The model allows the robot to follow a moving target with no delay (zero phase lag) by means of a predictor of the target motion. The results show that the new controller can reach a target placed at a starting distance of 1.2 meters in fewer than 100 control steps (1 second) and can follow a moving target at low to medium frequencies (0.3 to 0.5 Hz) with zero lag and a small position error (less than 4 cm along the main motion axis). The controller also has adaptive capabilities, being able to reach and follow the target even when some of the robot's joints are clamped.
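
The abstract refers to a predictor of the target motion that lets the controller anticipate the target and cancel the pursuit lag, but does not spell out its form here. The sketch below is a minimal, illustrative example assuming a standard constant-velocity Kalman filter that extrapolates the measured target position one control cycle ahead (10 ms at the 100-steps-per-second rate implied above); the class name and all parameter values are hypothetical and not taken from the paper.

```python
# Minimal sketch (assumption): a constant-velocity Kalman predictor used to
# anticipate the target position so the pursuit controller aims at where the
# target will be, rather than where it was last seen.
import numpy as np

class ConstantVelocityPredictor:
    def __init__(self, dt=0.01, process_var=1e-3, meas_var=1e-4):
        # State: [x, y, vx, vy]; measurement: target position [x, y] in meters.
        self.dt = dt
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)   # state transition
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)    # measurement model
        self.Q = process_var * np.eye(4)                   # process noise
        self.R = meas_var * np.eye(2)                      # measurement noise
        self.x = np.zeros(4)                               # state estimate
        self.P = np.eye(4)                                 # state covariance

    def update(self, z):
        # Standard predict/correct cycle for one new visual measurement z = [x, y].
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

    def predict_ahead(self, steps=1):
        # Extrapolate the target position `steps` control cycles into the
        # future; aiming at this point is what yields zero phase lag.
        x_future = self.x.copy()
        for _ in range(steps):
            x_future = self.F @ x_future
        return self.H @ x_future                            # predicted [x, y]
```

In a pursuit loop of this kind, the controller would call update() with each new visual measurement and servo the eye and head joints toward the position returned by predict_ahead(), rather than toward the raw measurement.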
