Bio-inspired control of eye-head coordination in a robotic anthropomorphic head

In this paper we address the problem of executing fast gaze shifts toward a visual target with a robotic platform. The platform is an anthropomorphic head with seven degrees of freedom (DOFs) designed to mimic the physical dimensions (i.e., geometry and masses), the performance (i.e., angular ranges and velocities), and the functional abilities (i.e., neck movements and eye vergence) of the human head. In the proposed approach, the problem of coordinating and executing a fast gaze shift is addressed by embedding knowledge from long-standing neurophysiological studies into the control paradigm of the robot. The major advantage of this approach is that controlling the robotic artifact is reformulated as a sensory-motor integration problem, so the design of the robot controller itself takes great advantage of the existing neuroscientific knowledge in the field. In this approach, the reference ("gold standard") performance for the robotic head is the accurate eye-head coordination observed during head-free gaze saccades in humans. To this aim, we implemented and tested on the robotic head a well-characterized, biologically inspired model of gaze control, and we verified whether the resulting motor output was consistent with the reported patterns of eye-head coordination in humans.
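The abstract does not give the controller's equations. As a rough illustration of the class of gaze-feedback models it refers to (eye and head both driven by the remaining gaze error, with the vestibulo-ocular reflex attenuated during the gaze saccade), the following is a minimal kinematic sketch in Python; all names, gains, and dynamics are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Illustrative parameters only; the paper's actual gains and dynamics are not given here.
DT = 0.001            # simulation time step [s]
OMR = 45.0            # oculomotor range limit for the eye-in-head angle [deg]
K_EYE = 40.0          # gaze-error gain driving the eye [1/s]
K_HEAD = 8.0          # gaze-error gain driving the head [1/s]
GAZE_DEADBAND = 0.5   # gaze error below which the shift is considered complete [deg]

def vor_gain(gaze_error):
    """VOR gain modulation: the reflex is attenuated while a large gaze
    saccade is in flight and restored as the gaze error vanishes
    (illustrative shape only)."""
    return 1.0 / (1.0 + abs(gaze_error) / 5.0)

def simulate_gaze_shift(target_deg, duration=1.0):
    eye, head = 0.0, 0.0                 # eye-in-head and head-in-space angles [deg]
    log = []
    for _ in range(int(duration / DT)):
        gaze = eye + head                # gaze = eye-in-head + head-in-space
        error = target_deg - gaze

        # Head is driven by the remaining gaze error (gaze feedback loop).
        head_vel = K_HEAD * error

        # Eye is driven by the same error, plus a VOR term that counter-rotates
        # the eye against head velocity, with its gain reduced while the gaze
        # error is still large.
        eye_vel = K_EYE * error - vor_gain(error) * head_vel

        # Integrate and clamp the eye to its oculomotor range.
        head += head_vel * DT
        eye = float(np.clip(eye + eye_vel * DT, -OMR, OMR))
        log.append((gaze, eye, head))

        if abs(error) < GAZE_DEADBAND:
            break
    return np.array(log)

if __name__ == "__main__":
    trace = simulate_gaze_shift(60.0)    # a 60 deg gaze shift, beyond the oculomotor range
    print("final gaze %.1f deg, eye %.1f deg, head %.1f deg" % tuple(trace[-1]))
```

In this sketch the eye saturates at its oculomotor range and the head contributes the remaining amplitude, which is the qualitative signature of head-free gaze saccades that the paper uses as its benchmark; the real controller would also include plant dynamics and head inertia, which are omitted here.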
