A comparison of geometric- and regression-based mobile gaze-tracking

Video-based gaze-tracking systems are typically restricted in their effective tracking space. This constraint limits the use of eye trackers in studying mobile human behavior. Here, we compare two possible approaches for estimating the gaze of participants who are free to walk in a large space whilst looking at different regions of a large display. Geometrically, we linearly combined eye-in-head rotations and head-in-world coordinates to derive a gaze vector and its intersection with a planar display, using a head-mounted eye tracker and a body-motion tracker. Alternatively, we employed Gaussian process regression to estimate the gaze intersection directly from the same input data. Our evaluation of both methods indicates that the regression approach can deliver results comparable to the geometric approach. We favor the regression approach because it has the potential for further optimization, provides confidence bounds for its gaze estimates, and offers greater flexibility in its implementation. Open-source software for the methods reported here is also provided for user implementation.
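The geometric approach described above amounts to rotating the eye-in-head gaze direction into world coordinates via the tracked head pose, then intersecting the resulting ray with the display plane. The following is a minimal sketch of that ray-plane intersection, not the authors' released software; all function and variable names (`gaze_on_plane`, `R_head`, etc.) are illustrative assumptions.

```python
import numpy as np

def gaze_on_plane(head_pos, R_head, gaze_dir_eye, plane_point, plane_normal):
    """Intersect a world-frame gaze ray with a planar display.

    head_pos      -- 3-vector, head position in world coordinates
                     (from the body-motion tracker)
    R_head        -- 3x3 rotation matrix, head orientation in the world
    gaze_dir_eye  -- 3-vector, gaze direction in head coordinates
                     (from the head-mounted eye tracker)
    plane_point   -- any point on the display plane
    plane_normal  -- unit normal of the display plane
    Returns the world-frame intersection point, or None if the gaze ray
    is (near-)parallel to the plane or the display is behind the viewer.
    """
    d = R_head @ gaze_dir_eye                 # gaze direction in world frame
    denom = plane_normal @ d
    if abs(denom) < 1e-9:                     # ray parallel to the plane
        return None
    t = (plane_normal @ (plane_point - head_pos)) / denom
    if t < 0:                                 # intersection behind the observer
        return None
    return head_pos + t * d

# Example: observer at the origin looking straight ahead at a wall 2 m away.
hit = gaze_on_plane(
    head_pos=np.zeros(3),
    R_head=np.eye(3),
    gaze_dir_eye=np.array([0.0, 0.0, 1.0]),
    plane_point=np.array([0.0, 0.0, 2.0]),
    plane_normal=np.array([0.0, 0.0, -1.0]),
)
```

The regression alternative would instead map the raw tracker signals (eye angles plus head position and orientation) to on-screen coordinates with a fitted Gaussian process, which yields predictive variances that can serve as the confidence bounds mentioned above.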
