Probabilistic approach to robust wearable gaze tracking

This paper presents a method for computing the gaze point from camera data captured with a wearable gaze-tracking device. The method combines a physical model of the human eye, Bayesian computer vision algorithms, and Kalman filtering, yielding high accuracy and low noise. Our C++ implementation processes camera streams in real time at 30 frames per second. The performance of the system is validated in an extensive experiment with 19 participants, using a custom-built device. Thanks to the underlying eye model and binocular cameras, the system remains accurate at all viewing distances and is invariant to movement of the device on the head. We also compare our system against a best-in-class commercial device and outperform it in spatial accuracy and precision. The software, hardware instructions, and experimental data are published as open source.
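
The Kalman filtering mentioned in the abstract can be illustrated with a minimal sketch: a 1-D filter with a random-walk state model, applied independently to each gaze coordinate to suppress measurement noise. The noise variances and the demo values below are illustrative assumptions, not the paper's actual parameters or model.

```cpp
#include <cmath>

// Minimal 1-D Kalman filter with a constant-position (random-walk) model.
// q: process noise variance (how fast the gaze point is allowed to drift),
// r: measurement noise variance (how noisy each camera-based estimate is).
struct Kalman1D {
    double x = 0.0;   // state estimate (one gaze coordinate)
    double p = 1.0;   // estimate variance (large initial uncertainty)
    double q, r;
    Kalman1D(double q_, double r_) : q(q_), r(r_) {}
    double update(double z) {
        p += q;                   // predict: uncertainty grows between frames
        double k = p / (p + r);   // Kalman gain: trust in the new measurement
        x += k * (z - x);         // correct the estimate toward the measurement
        p *= (1.0 - k);           // corrected estimate is less uncertain
        return x;
    }
};

// Demo: smooth a stream of noisy gaze coordinates fluctuating around 5.0.
inline double demo_filter() {
    Kalman1D kf(0.01, 0.25);
    double est = 0.0;
    for (int i = 0; i < 60; ++i) {
        // Deterministic +/-0.5 "noise" standing in for camera jitter.
        double z = 5.0 + ((i % 2 == 0) ? 0.5 : -0.5);
        est = kf.update(z);
    }
    return est;  // close to the underlying value 5.0
}
```

A larger `q` relative to `r` makes the filter follow fast eye movements more closely at the cost of passing through more noise; tuning that trade-off is what lets a tracker keep low noise during fixations without lagging behind saccades.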
