Noise-tolerant selection by gaze-controlled pan and zoom in 3D

This paper presents StarGazer, a new 3D interface for gaze-based interaction and target selection using continuous pan and zoom. Through StarGazer we address the issues of interacting with graph-structured data and applications (e.g. gaze typing systems) using low-resolution eye trackers or small displays. We show that it is possible to make robust selections even with a large number of selectable items on the screen and noisy gaze trackers. A test with 48 subjects demonstrated that users who had never tried gaze interaction before could rapidly adapt to the navigation principles of StarGazer. We tested three display sizes (down to PDA-sized displays) and found that large screens are faster to navigate than small displays and that the error rate is higher for the smallest display. Half of the subjects were exposed to severe noise deliberately added to the cursor positions. We found that this had a negative impact on efficiency. However, the users remained in control, and the noise did not seem to affect the error rate. Additionally, three subjects tested the effects of adding temporal noise to simulate latency in the gaze tracker. Even with a significant latency (about 200 ms), the subjects were able to type at acceptable rates. In a second test, seven subjects were allowed to adjust the zooming speed themselves. They achieved typing rates of more than eight words per minute without using language modeling. We conclude that the StarGazer application is an intuitive 3D interface for gaze navigation, allowing more selectable objects to be displayed on the screen than the accuracy of the gaze trackers would otherwise permit.
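The noise tolerance described above rests on a general principle: instead of requiring the raw gaze point to land on a small target, the view continuously pans toward the (smoothed) gaze estimate while zooming in, so targets grow until one is unambiguously under the view center. The sketch below is a minimal simulation of that idea, not the paper's implementation; the smoothing constant, zoom rate, selection threshold, and noise model are all illustrative assumptions.

```python
import random

def smooth(prev, sample, alpha=0.2):
    """Exponential smoothing to damp tracker noise (alpha is an assumed constant)."""
    return (prev[0] + alpha * (sample[0] - prev[0]),
            prev[1] + alpha * (sample[1] - prev[1]))

def pan_zoom_step(center, scale, gaze, pan_gain=0.1, zoom_rate=1.02):
    """One frame of continuous pan-and-zoom: drift the view center toward the
    gaze estimate and magnify slightly, so targets grow each frame."""
    new_center = (center[0] + pan_gain * (gaze[0] - center[0]),
                  center[1] + pan_gain * (gaze[1] - center[1]))
    return new_center, scale * zoom_rate

def select(gaze_samples, noise_sd=0.05, threshold_scale=4.0):
    """Simulate a selection: feed noisy gaze samples into the pan-and-zoom loop
    and report the view center once the magnification passes a threshold
    (i.e. the view has locked onto a single item)."""
    center, scale = (0.5, 0.5), 1.0
    est = gaze_samples[0]
    for g in gaze_samples:
        # deliberately corrupt the gaze sample, mimicking a noisy tracker
        noisy = (g[0] + random.gauss(0, noise_sd),
                 g[1] + random.gauss(0, noise_sd))
        est = smooth(est, noisy)
        center, scale = pan_zoom_step(center, scale, est)
        if scale >= threshold_scale:
            return center  # selection resolves to the item at the view center
    return None
```

Because the pan gain integrates many samples and the zoom magnifies targets over time, the final view center converges near the intended target even when individual samples are heavily perturbed, which is the mechanism behind the noise tolerance reported in the study.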
