Prediction of gaze estimation error for error-aware gaze-based interfaces

Gaze estimation error is inherent in head-mounted eye trackers and seriously impacts the performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front of a display and look at different parts of it. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision, we introduce an error model that is able to predict the gaze estimation error. Our method covers the major building blocks of mobile gaze estimation, specifically the mapping of pupil positions to scene camera coordinates, marker-based display detection, and the mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.
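The two mapping stages named above are commonly implemented as a polynomial regression from pupil to scene camera coordinates, followed by a homography from the scene camera to the display plane estimated from detected marker corners. The following sketch illustrates this generic two-stage pipeline; it is an assumption for illustration, not the paper's exact method, and all function names are hypothetical:

```python
import numpy as np

def fit_polynomial_mapping(pupil_pts, scene_pts):
    """Fit a second-order polynomial mapping from pupil positions to
    scene camera coordinates via least squares (one model per axis).
    This calibration step is a common, not paper-specific, choice."""
    x, y = pupil_pts[:, 0], pupil_pts[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, scene_pts, rcond=None)
    return coeffs  # shape (6, 2)

def apply_polynomial_mapping(coeffs, pupil_pts):
    """Map pupil positions to scene camera coordinates."""
    x, y = pupil_pts[:, 0], pupil_pts[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    return A @ coeffs

def homography_from_markers(scene_corners, screen_corners):
    """Estimate the 3x3 homography mapping scene camera points to
    on-screen coordinates from four detected fiducial marker corners,
    using the direct linear transform (DLT)."""
    M = []
    for (sx, sy), (dx, dy) in zip(scene_corners, screen_corners):
        M.append([sx, sy, 1, 0, 0, 0, -dx * sx, -dx * sy, -dx])
        M.append([0, 0, 0, sx, sy, 1, -dy * sx, -dy * sy, -dy])
    # The homography is the null-space vector of M, up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(M, dtype=float))
    return Vt[-1].reshape(3, 3)

def scene_to_screen(H, pts):
    """Project scene camera points onto the display plane."""
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    proj = pts_h @ H.T
    return proj[:, :2] / proj[:, 2:3]
```

Error at the final on-screen gaze point accumulates across both stages, which is why an error model has to account for each building block separately.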
