Accurate Regression-Based 3D Gaze Estimation Using Multiple Mapping Surfaces

Accurate 3D gaze estimation with a simple setup remains a challenging problem for head-mounted eye tracking. Current regression-based gaze direction estimation methods implicitly assume that all gaze directions intersect at a single point, called the eyeball pseudo-center. The effect of this implicit assumption on gaze estimation has not previously been examined. In this paper, a simulation of the intersections of gaze directions shows that the assumption holds only approximately, and a sensitivity analysis shows that its validity in gaze estimation is conditional. We therefore propose a gaze direction estimation method with one mapping surface that satisfies the conditions of the assumption by configuring the mapping surface appropriately and calibrating the eyeball pseudo-center to high quality; this method requires only two additional calibration points outside the mapping surface. Furthermore, by replacing the eyeball pseudo-center with a second calibrated surface, we propose a gaze direction estimation method with two mapping surfaces that further improves accuracy. The two-surface method improves on the state-of-the-art method by 20 percent (mean error from 1.84 degrees to 1.48 degrees) on a public dataset with a usage range of 1 meter, and by 17 percent (from 2.22 degrees to 1.85 degrees) on a public dataset with a usage range of 2 meters.
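The two-surface idea can be illustrated with a minimal sketch: fit one polynomial regression per calibration surface, mapping the 2D pupil position to a gaze point on that surface, then take the 3D gaze ray as the line through the two mapped points. This is an assumed simplification (planar surfaces at known depths, a generic second-order polynomial mapping, and hypothetical function names), not the paper's actual implementation:

```python
import numpy as np

def poly_features(pupil):
    # Second-order polynomial features of a 2D pupil position (x, y),
    # a common choice in regression-based eye trackers.
    x, y = pupil
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_mapping(pupils, targets):
    # Least-squares fit of one mapping surface: pupil position -> 2D
    # target position on a calibration plane (targets is n x 2).
    A = np.stack([poly_features(p) for p in pupils])   # (n, 6)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(targets, float), rcond=None)
    return coeffs                                      # (6, 2)

def gaze_ray(pupil, map_near, map_far, z_near, z_far):
    # Two mapping surfaces at known depths give two 3D points; the
    # gaze ray is the line through them (origin + unit direction).
    f = poly_features(pupil)
    p_near = np.array([*(f @ map_near), z_near])
    p_far = np.array([*(f @ map_far), z_far])
    d = p_far - p_near
    return p_near, d / np.linalg.norm(d)
```

With two calibrated surfaces, no explicit eyeball pseudo-center is needed: the second surface supplies the depth information that the pseudo-center would otherwise provide.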
