Real-time eye-gaze estimation by using a virtual reference point

Human interfaces driven by the line of sight (eye gaze) have been widely studied. In previous work, some methods rely on special light sources or dedicated hardware, while others detect the gaze direction with image-processing software. In this paper we propose a new eye-gaze input method. The method uses only a single facial image acquired under ordinary lighting conditions and requires no laborious camera calibration. The eye-gaze state is estimated from a virtual reference point, which is determined from the motion of feature points on the face. We demonstrate the validity of the new method through simulations and experiments.
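The abstract does not spell out how the virtual reference point is constructed, so the following is only a minimal sketch of one plausible reading: the reference point is taken as the centroid of tracked eye-corner feature points, and the gaze offset is the iris-centre displacement from that point, normalised by the eye-corner distance so the estimate is roughly insensitive to face size and camera distance. The function name, the landmark choice, and the normalisation are illustrative assumptions, not the authors' actual algorithm.

```python
# Sketch of a virtual-reference-point gaze estimate (assumed construction,
# not the method described in the paper).
import numpy as np


def estimate_gaze(eye_corners: np.ndarray, iris_center: np.ndarray) -> np.ndarray:
    """Return a dimensionless 2-D gaze offset for one eye.

    eye_corners : (2, 2) array of image coordinates for the inner and
                  outer eye corners (assumed tracked feature points).
    iris_center : (2,) array of image coordinates of the iris centre.
    """
    reference = eye_corners.mean(axis=0)           # assumed virtual reference point
    eye_width = np.linalg.norm(eye_corners[1] - eye_corners[0])
    return (iris_center - reference) / eye_width   # normalised gaze offset


if __name__ == "__main__":
    corners = np.array([[100.0, 120.0], [140.0, 118.0]])  # hypothetical landmarks
    iris = np.array([125.0, 119.0])
    # Positive x component: iris shifted toward the second corner.
    print(estimate_gaze(corners, iris))
```

In a real-time setting this per-frame computation would be preceded by feature-point tracking (e.g. of eye corners and iris) on the live video stream; that tracking stage is outside the scope of this sketch.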
