Estimation of a focused object using a corneal surface image for eye-based interaction

Researchers are considering the use of eye tracking in head-mounted camera systems, such as Google's Project Glass. Typical methods require detailed calibration in advance, but long periods of use disrupt the calibration between the eye and the scene camera. In addition, even if the point-of-regard is estimated with a portable eye tracker, the object being focused upon may not be identifiable. We therefore propose a novel method for estimating the object a user is focusing upon, in which an eye camera captures the reflection on the corneal surface. Eye and environment information can be extracted from the corneal surface image simultaneously. We use inverse ray tracing to rectify the reflected image and the scale-invariant feature transform (SIFT) to identify the object at which the point-of-regard is located. Unwarped images can also be generated continuously from corneal surface images. We consider that the proposed method could be applied to a guidance system, and we confirmed the feasibility of this application in experiments that estimated the focused object and the point-of-regard.
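The matching step described above can be illustrated with a minimal sketch: given SIFT descriptors extracted from the unwarped corneal image and from each candidate object image, Lowe's ratio test keeps only distinctive nearest-neighbour matches, and the candidate with the most surviving matches is taken as the focused object. This is an assumption-laden illustration, not the authors' implementation: descriptor extraction itself (e.g. with an off-the-shelf SIFT library) is taken as given, and the arrays here are placeholders for real 128-dimensional SIFT descriptors.

```python
import numpy as np

def ratio_test_matches(desc_query, desc_cand, ratio=0.75):
    """Count Lowe's-ratio-test matches between two SIFT descriptor sets.

    desc_query: (N, 128) descriptors from the unwarped corneal image.
    desc_cand:  (M, 128) descriptors from one candidate object image.
    A query descriptor is counted as matched when its nearest candidate
    descriptor is clearly closer than the second nearest
    (distance ratio below `ratio`).
    """
    # Pairwise Euclidean distances, shape (N, M), via broadcasting.
    d = np.linalg.norm(desc_query[:, None, :] - desc_cand[None, :, :], axis=2)
    matches = 0
    for row in d:
        idx = np.argsort(row)
        nearest, second = row[idx[0]], row[idx[1]]
        if second > 0 and nearest / second < ratio:
            matches += 1
    return matches

def pick_focused_object(desc_corneal, candidates):
    """Return the key of the candidate object image with the most matches.

    `candidates` maps an object label to its descriptor array.
    """
    return max(candidates, key=lambda k: ratio_test_matches(desc_corneal, candidates[k]))
```

In practice the corneal descriptors would come from the region around the estimated point-of-regard, so that the vote selects the object actually being fixated rather than any object visible in the reflection.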
