Eye gaze tracking for a humanoid robot

Humans use eye gaze extensively in their daily interactions with other humans. Humanoid robots, on the other hand, have not yet taken full advantage of this form of implicit communication. In this paper we present a passive monocular gaze tracking system implemented on the iCub humanoid robot. Validation of the system showed that it is a viable low-cost, calibration-free gaze tracking solution for humanoid platforms, with a mean absolute error of about 5 degrees on horizontal gaze angle estimates. We also demonstrate the applicability of our system to human-robot collaborative tasks, showing that the robot's ability to read eye gaze can enable successful implicit communication between human and robot. Finally, in the conclusion we give general guidelines on how to improve our system and discuss potential applications of gaze estimation for humanoid robots.
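
To make the idea of a passive, calibration-free monocular pipeline concrete, here is a minimal sketch that combines dlib's pretrained 68-point face alignment with OpenCV's solvePnP to recover horizontal head orientation from a single camera frame. This is an illustrative stand-in, not the system described in the paper: the generic 3D face model points, the pinhole focal-length guess, and the use of head yaw as a proxy for horizontal gaze angle are all simplifying assumptions made here for brevity.

```python
# Illustrative sketch of a passive monocular head-pose pipeline: dlib 68-point
# face alignment + OpenCV solvePnP. NOT the paper's implementation; the 3D
# model points, focal-length guess, and head-yaw-as-gaze proxy are assumptions.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Pretrained 68-landmark model distributed with dlib.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# Generic 3D face model points in millimetres (a common approximation).
MODEL_POINTS = np.array([
    (0.0,    0.0,    0.0),      # nose tip         (landmark 30)
    (0.0,  -330.0,  -65.0),     # chin             (landmark 8)
    (-225.0, 170.0, -135.0),    # left eye corner  (landmark 36)
    (225.0,  170.0, -135.0),    # right eye corner (landmark 45)
    (-150.0, -150.0, -125.0),   # mouth left       (landmark 48)
    (150.0, -150.0, -125.0),    # mouth right      (landmark 54)
], dtype=np.float64)

def estimate_head_yaw(frame):
    """Return head yaw in degrees for the first detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        return None
    shape = predictor(gray, faces[0])
    image_points = np.array(
        [(shape.part(i).x, shape.part(i).y) for i in (30, 8, 36, 45, 48, 54)],
        dtype=np.float64)

    # Rough pinhole camera: focal length ~ image width, principal point at centre.
    h, w = frame.shape[:2]
    camera_matrix = np.array([[w, 0, w / 2],
                              [0, w, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points, camera_matrix,
                               np.zeros(4), flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Rotation about the camera's vertical axis (ZYX Euler convention).
    return float(np.degrees(np.arctan2(-R[2, 0],
                                       np.sqrt(R[0, 0]**2 + R[1, 0]**2))))

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # any monocular camera, e.g. a robot's eye camera
    ok, frame = cap.read()
    if ok:
        print("head yaw (deg):", estimate_head_yaw(frame))
```

A full gaze estimate would refine this head-pose prior with pupil localization inside the eye-region landmarks; head yaw alone approximates gaze direction only while the eyes remain centred in their orbits.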
