Building a Personalized, Auto-Calibrating Eye Tracker from User Interactions

We present PACE, a Personalized, Automatically Calibrating Eye-tracking system that unobtrusively identifies and collects training data from user interaction events on standard computing systems, without the need for specialized equipment. PACE relies on eye/facial analysis of webcam data based on a set of robust geometric gaze features, together with a two-layer data validation mechanism that identifies good training samples from daily interaction data. The design of the system is founded on an in-depth investigation of the relationship between gaze patterns and interaction cues, and takes user preferences and habits into consideration. The result is an adaptive, data-driven approach that continuously recalibrates, adapts, and improves with additional use. A quantitative evaluation on 31 subjects across different interaction behaviors shows that training instances identified by the PACE data-collection mechanism have higher gaze-point/interaction-cue consistency than those identified by conventional approaches. An in-situ study using real-life tasks on a diverse set of interactive applications demonstrates that PACE gaze estimation achieves an average error of 2.56°, which is comparable to the state of the art but requires no explicit training or calibration. This demonstrates the effectiveness of both the gaze estimation method and the corresponding data-collection mechanism.
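The core idea of implicit calibration can be illustrated with a minimal sketch: treat interaction events (e.g., mouse clicks) as noisy labels for the on-screen gaze target, filter out samples where the current model strongly disagrees with the clicked location, and refit the feature-to-screen mapping as samples accumulate. All class names, the two-component feature encoding, the per-axis linear map, and the residual threshold below are illustrative assumptions for exposition; PACE itself uses richer geometric gaze features and a two-layer validation mechanism.

```python
class ImplicitCalibrator:
    """Toy implicit-calibration loop: accumulate (gaze-feature, click-position)
    pairs from interaction events and fit a per-axis linear map by ordinary
    least squares. Names and thresholds are hypothetical, not from PACE."""

    def __init__(self, residual_threshold=150.0):
        self.samples = []                      # accepted (feature, (x, y)) pairs
        self.residual_threshold = residual_threshold  # px; rejection cutoff
        self.coeffs = None                     # ((a_x, b_x), (a_y, b_y))

    def _fit_axis(self, xs, ys):
        # Ordinary least squares for y = a*x + b (assumes xs are not all equal).
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        var = sum((x - mx) ** 2 for x in xs)
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / var
        return a, my - a * mx

    def predict(self, feature):
        # Map a 2-D gaze feature (e.g., pupil-to-corner offsets) to screen coords.
        (ax, bx), (ay, by) = self.coeffs
        return ax * feature[0] + bx, ay * feature[1] + by

    def add_event(self, feature, click_pos):
        """Offer one interaction event as a training sample.
        Returns True if accepted, False if rejected by the validation check."""
        if self.coeffs is not None:
            # Validation layer: once a model exists, reject samples whose
            # predicted gaze point disagrees strongly with the clicked location.
            px, py = self.predict(feature)
            err = ((px - click_pos[0]) ** 2 + (py - click_pos[1]) ** 2) ** 0.5
            if err > self.residual_threshold:
                return False
        self.samples.append((feature, click_pos))
        if len(self.samples) >= 3:
            # Recalibrate: refit each screen axis against its feature component.
            fxs = [f[0] for f, _ in self.samples]
            fys = [f[1] for f, _ in self.samples]
            cxs = [c[0] for _, c in self.samples]
            cys = [c[1] for _, c in self.samples]
            self.coeffs = (self._fit_axis(fxs, cxs), self._fit_axis(fys, cys))
        return True
```

For example, feeding synthetic click events whose positions are an exact linear function of the feature lets the calibrator recover that mapping after a handful of interactions; in real use the clicks are only approximately gaze-aligned, which is why the rejection step matters.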
