Error-aware gaze-based interfaces for robust mobile gaze interaction

Gaze estimation error can severely impair the usability and performance of mobile gaze-based interfaces, because the error varies constantly across interaction positions. In this work, we explore error-aware gaze-based interfaces that estimate and adapt to gaze estimation error on the fly. We implement a sample error-aware user interface for gaze-based selection together with three error compensation methods: a naïve approach that increases component size in direct proportion to the absolute error, a recent model by Feit et al. [10] that is based on the two-dimensional error distribution, and a novel predictive model that shifts the gaze point by a directional error estimate. We evaluate these methods in a 12-participant user study and show that our predictive model significantly outperforms the others in terms of selection rate, particularly for small gaze targets. These results underline both the feasibility and the potential of next-generation error-aware gaze-based user interfaces.
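
To make the contrast between the three compensation strategies concrete, the following minimal Python sketch shows how each one could be applied at selection time. It is an illustration under assumed interfaces, not the implementation evaluated in the study: all function names and the target representation are hypothetical, and the error estimates (abs_error, std_x/std_y, error_vec) stand in for whatever the on-the-fly error model (cf. [15]) provides, in the same pixel units as the gaze coordinates.

# Minimal sketch (assumed names, not the authors' implementation) that
# contrasts the three compensation strategies at selection time.
# Targets are circles given as dicts with keys "x", "y", "radius".

def select_naive(gaze, targets, abs_error):
    """Naive: grow every target's radius in proportion to the
    absolute (magnitude-only) error estimate, then hit-test."""
    gx, gy = gaze
    for t in targets:
        r = t["radius"] + abs_error
        if (gx - t["x"]) ** 2 + (gy - t["y"]) ** 2 <= r ** 2:
            return t
    return None

def select_distribution(gaze, targets, std_x, std_y):
    """Distribution-based (in the spirit of Feit et al. [10]): expand
    the hit region per axis according to the spread of the
    two-dimensional error distribution."""
    gx, gy = gaze
    for t in targets:
        if (abs(gx - t["x"]) <= t["radius"] + 2 * std_x
                and abs(gy - t["y"]) <= t["radius"] + 2 * std_y):
            return t
    return None

def select_predictive(gaze, targets, error_vec):
    """Predictive: subtract the signed, directional error estimate
    (offset of the reported gaze from the true gaze) from the gaze
    point, then hit-test unmodified targets."""
    gx, gy = gaze[0] - error_vec[0], gaze[1] - error_vec[1]
    for t in targets:
        if (gx - t["x"]) ** 2 + (gy - t["y"]) ** 2 <= t["radius"] ** 2:
            return t
    return None

The design difference worth noting: the first two strategies relax the hit region, which costs screen space and can make neighboring targets overlap, whereas the predictive variant corrects the gaze signal itself and leaves target sizes untouched, consistent with its advantage on small targets reported above.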

[1] Dan Witzner Hansen, et al. Mobile gaze-based screen interaction in 3D environments, 2011, NGCA '11.

[2] Daniel Sonntag, et al. Kognit: Intelligent Cognitive Enhancement Technology by Cognitive Models and Mixed Reality for Dementia Patients, 2015, AAAI Fall Symposia.

[3] Andreas Nürnberger, et al. Designing gaze-supported multimodal interactions for the exploration of large image collections, 2011, NGCA '11.

[4] Hongbin Zha, et al. Improving eye cursor's stability for eye pointing tasks, 2008, CHI.

[5] Oleg Spakov, et al. Comparison of gaze-to-objects mapping algorithms, 2011, NGCA '11.

[6] Antonio Krüger, et al. GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays, 2015, UIST.

[7] Thomas Kieninger, et al. Gaze guided object recognition using a head-mounted eye tracker, 2012, ETRA '12.

[8] Jeffrey S. Shell, et al. EyePliances: attention-seeking devices that respond to visual attention, 2003, CHI Extended Abstracts.

[9] Dan Witzner Hansen, et al. Parallax error in the monocular head-mounted eye trackers, 2012, UbiComp.

[10] Anna Maria Feit, et al. Toward Everyday Gaze Input: Accuracy and Precision of Eye Tracking and Implications for Design, 2017, CHI.

[11] Yunfeng Zhang, et al. Mode-of-disparities Error Correction of Eye-tracking Data, 2011.

[12] Marcus Nyström, et al. The influence of calibration method and eye physiology on eyetracking data quality, 2013, Behavior Research Methods.

[13] Pieter Blignaut, et al. The effect of mapping function on the accuracy of a video-based eye tracker, 2013, ETSA '13.

[14] Hans-Werner Gellersen, et al. Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets, 2013, UbiComp.

[15] Andreas Bulling, et al. Prediction of gaze estimation error for error-aware gaze-based interfaces, 2016, ETRA.

[16] Hans-Werner Gellersen, et al. Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements, 2015, UIST.

[17] Anthony J. Hornof, et al. Easy post-hoc spatial recalibration of eye tracking data, 2014, ETRA.

[18] Oleg Spakov, et al. Real-time hidden gaze point correction, 2014, ETRA.

[19] Daniel Sonntag, et al. Towards Episodic Memory Support for Dementia Patients by Recognizing Objects, Faces and Text in Eye Gaze, 2015, KI.

[20] Jörg Müller, et al. GazeHorizon: enabling passers-by to interact with public displays by gaze, 2014, UbiComp.

[21] Jeremy Hales, et al. Interacting with Objects in the Environment by Gaze and Hand Gestures, 2013.

[22] Raimund Dachselt, et al. Look & touch: gaze-supported target acquisition, 2012, CHI.

[23] Daniel Sonntag, et al. Gaze-guided object classification using deep neural networks for attention-based computing, 2016, UbiComp Adjunct.

[24] Klaus Bengler, et al. Implementing gaze control for peripheral devices, 2011, PETMEI '11.

[25] I. Scott MacKenzie, et al. Eye gaze interaction with expanding targets, 2004, CHI EA '04.

[26] Andreas Bulling, et al. Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction, 2014, UbiComp Adjunct.

[27] Oleg Spakov. Comparison of eye movement filters used in HCI, 2012, ETRA '12.

[28] Gerhard Tröster, et al. EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography, 2008, Fun and Games.

[29] Juan J. Cerrolaza, et al. Error characterization and compensation in eye tracking systems, 2012, ETRA '12.

[30] Antonio Krüger, et al. Collaborative Newspaper: Exploring an adaptive Scrolling Algorithm in a Multi-user Reading Scenario, 2015, PerDis.

[31] Hendrik Koesling, et al. Entropy-based correction of eye tracking data for static scenes, 2012, ETRA.

[32] Stefan Kohlbecher, et al. Gaze-based interaction in various environments, 2008, VNBA '08.

[33] Jan Drewes, et al. Shifts in reported gaze position due to changes in pupil size: ground truth and compensation, 2012, ETRA '12.

[34] Hans-Werner Gellersen, et al. Cross-device gaze-supported point-to-point content transfer, 2014, ETRA.

[35] Akito Monden, et al. Evaluation of gaze-added target selection methods suitable for general GUIs, 2005, Int. J. Comput. Appl. Technol.

[36] Yanxia Zhang, et al. SideWays: a gaze interface for spontaneous interaction with situated displays, 2013, CHI.

[37] Marcus Nyström, et al. Improving the Accuracy of Video-Based Eye-Tracking in Real-Time through Post-Calibration Regression, 2014.

[38] Gaël Varoquaux, et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.

[39] Moshe Eizenman, et al. A new methodology for determining point-of-gaze in head-mounted eye tracking systems, 2004, IEEE Transactions on Biomedical Engineering.

[40] Nicolas Roussel, et al. 1 € filter: a simple speed-based low-pass filter for noisy input in interactive systems, 2012, CHI.