Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches

Knowing the user's point of gaze has significant potential to enhance current human-computer interfaces, given that eye movements can be used as an indicator of a user's attentional state. The primary obstacle to integrating eye movements into today's interfaces is the lack of a reliable, low-cost, open-source eye-tracking system. Towards making such a system available to interface designers, we have developed a hybrid eye-tracking algorithm that integrates feature-based and model-based approaches, and we have released it in an open-source package. We refer to this algorithm as "starburst" because of the novel way in which pupil features are detected. The starburst algorithm is more accurate than pure feature-based approaches yet significantly less time-consuming than pure model-based approaches. The current implementation is tailored to tracking eye movements in infrared video obtained from an inexpensive head-mounted eye-tracking system. A validation study showed that the technique can reliably estimate eye position with an accuracy of approximately one degree of visual angle.
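To give a feel for the ray-based feature detection the name "starburst" alludes to, the sketch below casts rays outward from a seed point inside the (dark) pupil and records the first location along each ray where intensity rises sharply, yielding candidate pupil-edge points. This is only a minimal illustration under assumed parameters (`n_rays`, `threshold`, `max_len` are illustrative, not taken from the paper); the full algorithm additionally iterates the ray casting from the detected features and fits an ellipse to them robustly, which is omitted here.

```python
import numpy as np

def starburst_features(image, seed, n_rays=18, threshold=20.0, max_len=100):
    """Cast rays outward from `seed` in a grayscale image (float array, to
    avoid uint8 wraparound) and return, for each ray, the first pixel where
    intensity jumps by more than `threshold` -- a candidate pupil-edge point.
    Illustrative sketch only; parameter names are assumptions."""
    h, w = image.shape
    cx, cy = seed
    features = []
    for angle in np.linspace(0.0, 2 * np.pi, n_rays, endpoint=False):
        dx, dy = np.cos(angle), np.sin(angle)
        prev = image[int(cy), int(cx)]
        for step in range(1, max_len):
            x = int(round(cx + dx * step))
            y = int(round(cy + dy * step))
            if not (0 <= x < w and 0 <= y < h):
                break  # ray left the image without crossing an edge
            val = image[y, x]
            if val - prev > threshold:
                # dark pupil -> brighter iris transition found on this ray
                features.append((x, y))
                break
            prev = val
    return features

# Usage on a synthetic frame: a dark disk (pupil) on a bright background.
img = np.full((200, 200), 200.0)
yy, xx = np.mgrid[0:200, 0:200]
img[(xx - 100) ** 2 + (yy - 100) ** 2 <= 40 ** 2] = 0.0
pts = starburst_features(img, (100, 100))
```

On this synthetic image, every ray finds an edge point at roughly the disk radius; in the full method such points would then seed an ellipse fit to estimate the pupil center and contour.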
