Smart user interface for mobile consumer devices using model-based eye-gaze estimation

A smart user interface for mobile consumer devices was developed using a robust eye-gaze system that requires no hand motion. Using one camera and one display already available in popular mobile devices, the eye-gaze system estimates the visual angle, which identifies the area of interest on the display and thus the cursor position. Three novel techniques were developed to make the system robust, user-independent, and invariant to head and device motion. First, by carefully investigating the geometric relation between the device and the user's cornea, a new algorithm was developed to estimate the cornea center position, which is directly related to the optical axis of the eye. Unlike previous algorithms, it does not require the user-dependent cornea radius. Second, to make the system robust for practical application, an algorithm was developed to compensate for imaging position errors caused by the finite camera resolution. Third, a binocular algorithm was developed to estimate the user-dependent angular offsets between the optical and visual axes using only a single calibration point. The proposed system was demonstrated to be accurate enough for many practical mobile user interfaces.
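The abstract outlines a model-based pipeline: estimate the cornea center, derive the optical axis toward the pupil center, rotate it by the user-specific angular offsets obtained from single-point calibration, and intersect the resulting visual axis with the display plane. The following is a minimal sketch of that generic pipeline, assuming a shared camera/display coordinate frame; the function names, offset parameterization, and values are illustrative assumptions and do not reproduce the paper's specific cornea-center estimation or resolution-error compensation algorithms.

```python
import numpy as np

def optical_axis(cornea_center, pupil_center):
    """Unit vector of the eye's optical axis, pointing from the cornea center toward the pupil center."""
    v = pupil_center - cornea_center
    return v / np.linalg.norm(v)

def visual_axis(opt_axis, alpha, beta):
    """Tilt the optical axis by user-specific angular offsets (alpha: horizontal, beta: vertical,
    in radians). Simplification: the offsets are applied as rotations in the camera frame."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    R_y = np.array([[ca, 0, sa], [0, 1, 0], [-sa, 0, ca]])   # horizontal offset
    R_x = np.array([[1, 0, 0], [0, cb, -sb], [0, sb, cb]])   # vertical offset
    return R_x @ R_y @ opt_axis

def gaze_on_display(cornea_center, vis_axis, plane_point, plane_normal):
    """Intersect the visual-axis ray (origin at the cornea center) with the display plane."""
    denom = np.dot(plane_normal, vis_axis)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the display plane
    t = np.dot(plane_normal, plane_point - cornea_center) / denom
    return cornea_center + t * vis_axis

# Example with hypothetical values (millimeters, display plane at z = 0):
cornea = np.array([10.0, -5.0, 300.0])
pupil = np.array([10.2, -4.8, 295.6])
axis = optical_axis(cornea, pupil)
vaxis = visual_axis(axis, alpha=np.radians(5.0), beta=np.radians(1.5))
gaze = gaze_on_display(cornea, vaxis,
                       plane_point=np.zeros(3),
                       plane_normal=np.array([0.0, 0.0, 1.0]))
```

In this sketch the single-point calibration would amount to solving for alpha and beta such that the predicted gaze point coincides with one known fixation target; the binocular formulation described in the abstract constrains both eyes' visual axes to that same target.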
