Real-time tracking and optimised gaze control for a redundant humanoid robot head

This paper addresses the problem of gaze control for a humanoid robot head. We propose a control framework comprising two components: 1) an adaptive visual tracker capable of tracking an arbitrary object with an unknown trajectory; and 2) an optimised visual control strategy that coordinates all joint motions of a redundant head-neck mechanism to keep the tracked object at the image centre. The framework requires no prior knowledge of the object's trajectory and yields optimal joint motions, in the sense that it minimises the maximum joint motion needed to maintain a constant gaze. An adaptive gain is used with the controller to adapt the convergence rate of the task-space error when the tracked object moves unpredictably or erratically. The proposed framework has been validated in real time on our bi-manual humanoid robot platform Boris, and the reported experimental results demonstrate the effectiveness of our approach.
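
For intuition, the sketch below shows how such an adaptive-gain, redundancy-aware image-based servoing loop could be written. It is a minimal illustration under stated assumptions: the function names (adaptive_gain, gaze_joint_velocities), the ViSP-style exponential gain profile, and the damped least-squares pseudo-inverse are not taken from the paper; in particular, the pseudo-inverse returns the minimum-norm joint velocity, whereas the paper minimises the maximum joint motion.

```python
import numpy as np

def adaptive_gain(err_norm, lam_0=4.0, lam_inf=0.4, slope_0=30.0):
    # Smoothly interpolates between a large gain near convergence (lam_0)
    # and a smaller gain for large errors (lam_inf), a common ViSP-style profile.
    return (lam_0 - lam_inf) * np.exp(-slope_0 * err_norm / (lam_0 - lam_inf)) + lam_inf

def gaze_joint_velocities(feature, centre, J_task, damping=1e-3):
    """Joint velocities that drive the tracked feature towards the image centre.

    feature : (2,) tracked point in normalised image-plane coordinates
    centre  : (2,) desired image position (the principal point)
    J_task  : (2, n) Jacobian mapping the n head-neck joint velocities to the
              feature's image velocity (interaction matrix times robot Jacobian)
    """
    e = feature - centre                    # task-space error
    lam = adaptive_gain(np.linalg.norm(e))  # larger gain as the error shrinks
    # Damped least-squares inverse of the 2 x n task Jacobian: well behaved
    # near singularities and, for a redundant chain, returns the minimum-norm
    # joint velocity (an illustrative stand-in for the paper's criterion).
    JJt = J_task @ J_task.T + damping * np.eye(2)
    return -lam * (J_task.T @ np.linalg.solve(JJt, e))
```

In use, such a routine would be called once per control cycle with the tracker's current estimate of the object's image position and the head-neck Jacobian evaluated from forward kinematics, and the returned joint velocities sent to the low-level controllers.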
