Eyes and Keys: An Evaluation of Click Alternatives Combining Gaze and Keyboard

With eye gaze tracking technology entering the consumer market, there is increased interest in using it as an input device, similar to the mouse. This holds promise for situations where a typical desk space is not available. While gaze seems natural for pointing, it is inherently inaccurate, which makes designing fast and accurate methods for clicking targets (“click alternatives”) difficult. We investigate click alternatives that combine gaze with a standard keyboard (“gaze & key click alternatives”) to achieve an experience where the user’s hands can remain on the keyboard all the time. We propose three novel click alternatives (“Letter Assignment”, “Offset Menu” and “Ray Selection”) and present an experiment that compares them with a naive gaze pointing approach (“Gaze & Click”) and the mouse. The experiment uses a randomized, realistic click task in a web browser to collect data about click times and click accuracy, as well as the users’ preferences. Our results indicate that eye gaze tracking is currently too inaccurate for the Gaze & Click approach to work reliably. While Letter Assignment and Offset Menu were usable and a large improvement over Gaze & Click, they were still significantly slower and less accurate than the mouse.
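To make the core idea concrete, below is a minimal sketch of how a Letter Assignment style click alternative could work in a browser: targets near the gaze point are labelled with letters, and pressing a letter’s key “clicks” the corresponding target. This is an illustration under stated assumptions, not the paper’s implementation; the gaze source, the `LABEL_KEYS` set, the `GAZE_RADIUS` threshold, and the `drawLabel` helper are all hypothetical.

```typescript
// Sketch of a Letter Assignment click alternative (illustrative assumptions,
// not the authors' implementation).

interface Point { x: number; y: number; }

interface Target {
  element: HTMLElement;  // the clickable element on the page
  center: Point;         // its center in screen coordinates
}

const LABEL_KEYS = "ASDFJKL;".split("");  // home-row keys; an assumption
const GAZE_RADIUS = 150;                  // px around gaze; tune for tracker accuracy

const labelled = new Map<string, Target>();

// Assign letters to targets within GAZE_RADIUS of the current gaze point,
// nearest first. Label rendering is left as a stub.
function assignLetters(gaze: Point, targets: Target[]): void {
  labelled.clear();
  const near = targets
    .filter(t => Math.hypot(t.center.x - gaze.x, t.center.y - gaze.y) <= GAZE_RADIUS)
    .sort((a, b) =>
      Math.hypot(a.center.x - gaze.x, a.center.y - gaze.y) -
      Math.hypot(b.center.x - gaze.x, b.center.y - gaze.y));
  near.slice(0, LABEL_KEYS.length).forEach((t, i) => {
    labelled.set(LABEL_KEYS[i], t);
    // drawLabel(t, LABEL_KEYS[i]);  // hypothetical: overlay the letter on the target
  });
}

// On a key press, click the target whose label matches the key, so the
// hands never have to leave the keyboard.
document.addEventListener("keydown", e => {
  const target = labelled.get(e.key.toUpperCase());
  if (target) {
    target.element.click();
    labelled.clear();
  }
});
```

In a real system, `assignLetters` would be re-run on each gaze sample (or on fixation), which is where the tracker’s inaccuracy matters: the radius must be large enough to contain the intended target, yet small enough that few letters are needed.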
