CalibMe: Fast and Unsupervised Eye Tracker Calibration for Gaze-Based Pervasive Human-Computer Interaction

As devices around us become smart, our gaze is poised to become the next frontier of human-computer interaction (HCI). State-of-the-art mobile eye-tracker systems typically rely on eye-model-based gaze estimation approaches, which do not require calibration. However, such approaches require specialized hardware (e.g., multiple cameras and glint points), can be significantly affected by glasses, and are thus not fit for ubiquitous gaze-based HCI. In contrast, regression-based gaze estimation is a straightforward approach requiring only one eye camera and one scene camera, but it necessitates calibration. A fast and accurate calibration is therefore a key development for enabling ubiquitous gaze-based HCI. In this paper, we introduce CalibMe, a novel method that exploits collection markers (automatically detected fiducial markers) to allow eye-tracker users to gather a large array of calibration points, remove outliers, and automatically reserve evaluation points in a fast and unsupervised manner. The proposed approach is evaluated against a nine-point calibration method, which is typically used due to its relatively short calibration time and adequate accuracy. CalibMe reached a mean angular error of 0.59° (σ = 0.23°), in contrast to 0.82° (σ = 0.15°) for a nine-point calibration, attesting to the efficacy of the method. Moreover, users are able to calibrate the eye tracker anywhere and independently in approximately 10 s, using a cellphone to display the collection marker.
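To illustrate the regression-based estimation the abstract refers to, the sketch below fits a second-order polynomial mapping from pupil positions (in eye-camera coordinates) to gaze points (in scene-camera coordinates) via least squares, a common formulation for this class of eye trackers. This is a minimal, illustrative sketch, not the paper's implementation: the function names and the specific second-order feature set are assumptions, and CalibMe's outlier removal and evaluation-point reservation are omitted.

```python
import numpy as np

def poly_features(pupil):
    # Second-order polynomial features of pupil position (x, y);
    # second-order terms are a common choice for gaze mapping.
    x, y = pupil[:, 0], pupil[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def calibrate(pupil, scene):
    # Least-squares fit of the pupil -> scene mapping; one column of
    # coefficients per scene coordinate (horizontal and vertical).
    A = poly_features(pupil)
    coeffs, *_ = np.linalg.lstsq(A, scene, rcond=None)
    return coeffs

def estimate_gaze(pupil, coeffs):
    # Apply the calibrated mapping to new pupil positions.
    return poly_features(pupil) @ coeffs
```

With many calibration samples (as CalibMe's collection markers provide), the overdetermined least-squares fit averages out per-sample noise, which is one reason gathering a large array of points improves accuracy over a sparse nine-point grid.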
