Yet another gaze detector: An embodied calibration free system for the iCub robot

The recognition of gaze, for example mutual gaze, plays an important role in social interaction. Previous research shows that even infants are capable of detecting mutual gaze. Such abilities are relevant for robots that learn from interaction, for example to detect when the robot is being addressed. Although various gaze tracking methods have been proposed, few are openly available for robotic platforms such as the iCub. In this paper we describe a gaze tracking system for humanoid robots that is built entirely on freely available libraries and data sets. Our system estimates horizontal and vertical gaze directions from low-resolution VGA images of the robot's embodied vision at 30 frames per second. For this purpose we developed a pupil detection algorithm that combines existing approaches to increase robustness to noise. Our method combines positions of face and eye features with context features such as eyelid correlates, and thus does not rely on a fixed head orientation. An evaluation on the iCub robot shows that our method estimates mutual gaze with 96% accuracy at 8° tolerance and one meter distance to the robot. The results further support that mutual gaze detection yields higher accuracy in an embodied setup than in other configurations.
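One of the existing pupil-localisation approaches a detector like this might combine is gradient-based eye-centre localisation, which selects the point whose displacement vectors best align with the image gradients along the iris edge. The sketch below is a minimal, illustrative NumPy implementation of that idea on a synthetic eye patch; it is not the paper's actual algorithm, and all names and parameters (the darkness weighting, the gradient-magnitude threshold) are assumptions chosen for the example.

```python
import numpy as np

def pupil_center_by_gradients(patch):
    """Estimate the eye centre as the pixel maximising the mean squared
    alignment between displacement vectors and image gradients
    (gradient-based eye-centre localisation, simplified)."""
    gy, gx = np.gradient(patch.astype(float))  # np.gradient returns (d/dy, d/dx)
    mag = np.hypot(gx, gy)
    # keep only strong gradients, i.e. the pupil/iris boundary
    mask = mag > mag.mean() + 0.5 * mag.std()
    ys, xs = np.nonzero(mask)
    gxs, gys = gx[mask] / mag[mask], gy[mask] / mag[mask]  # unit gradients
    # darker pixels are more likely pupil centres, so weight by inverted intensity
    weight = patch.max() - patch.astype(float)
    h, w = patch.shape
    best, best_score = (0, 0), -1.0
    for cy in range(h):
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            valid = norm > 0
            # cosine between displacement and gradient direction, squared
            dots = (dx[valid] * gxs[valid] + dy[valid] * gys[valid]) / norm[valid]
            score = weight[cy, cx] * np.mean(dots ** 2)
            if score > best_score:
                best_score, best = score, (cx, cy)
    return best

# synthetic eye patch: bright background with a dark pupil disc at (25, 18)
h, w = 36, 48
yy, xx = np.mgrid[0:h, 0:w]
patch = np.full((h, w), 200.0)
patch[(xx - 25) ** 2 + (yy - 18) ** 2 <= 6 ** 2] = 30.0
print(pupil_center_by_gradients(patch))  # prints an estimate close to (25, 18)
```

The brute-force scan over candidate centres is O(pixels × edge points); practical implementations restrict candidates to dark regions or use coarse-to-fine search to reach frame rate.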
