Development of an Eye-Tracking Pen Display for Analyzing Embodied Interaction

In recent years, intuitive user interfaces such as touch panels and pen displays have become widely used in PCs and PDAs. The authors previously developed a bright pupil camera and, based on this camera and a new aspherical model of the eye, an eye-tracking pen display. In this paper, a robust gaze estimation method that uses an integrated-light-source camera is proposed for analyzing embodied interaction, and a prototype eye-tracking pen display is developed. The accuracy of the system is approximately 12 mm on a 15" pen display, which is sufficient for supporting human interaction.
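To make the gaze-to-display mapping concrete, the following is a minimal, hypothetical sketch in Python. It is not the authors' aspherical-cornea method; it illustrates the generic pupil-centre/corneal-reflection approach often used in remote eye tracking, where the vector between the bright-pupil centre and the glint is mapped to display coordinates in millimetres by a second-order polynomial fitted from calibration points. The function names, the 9-point calibration grid, the 15" panel dimensions (about 305 x 230 mm), and the synthetic data are assumptions for illustration only.

import numpy as np

def poly_features(v):
    # Second-order polynomial terms of a pupil-glint vector v = (x, y).
    x, y = v
    return np.array([1.0, x, y, x * y, x * x, y * y])

def fit_mapping(pupil_glint_vectors, screen_points_mm):
    # Least-squares fit of the polynomial mapping from calibration data.
    A = np.array([poly_features(v) for v in pupil_glint_vectors])
    B = np.asarray(screen_points_mm, dtype=float)        # (N, 2) targets in mm
    coeffs, *_ = np.linalg.lstsq(A, B, rcond=None)       # (6, 2) coefficients
    return coeffs

def estimate_gaze(coeffs, pupil_glint_vector):
    # Map one pupil-glint vector to a gaze point (x, y) on the display in mm.
    return poly_features(pupil_glint_vector) @ coeffs

# Example: synthetic 9-point calibration on a roughly 305 x 230 mm panel.
rng = np.random.default_rng(0)
targets = np.array([[x, y] for x in (30, 150, 270) for y in (25, 115, 205)], float)
vectors = targets / 300.0 + rng.normal(scale=0.005, size=targets.shape)  # fake measurements
C = fit_mapping(vectors, targets)
err = np.linalg.norm(estimate_gaze(C, vectors[0]) - targets[0])
print(f"residual at first calibration point: {err:.1f} mm")

In a real system the residuals would be evaluated against held-out test points; a mean error around 12 mm on a 15" panel, as reported above, corresponds to roughly 1-2 degrees of visual angle at typical pen-display viewing distances.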
