Gaze vs. Mouse: A Fast and Accurate Gaze-Only Click Alternative

Eye gaze tracking is a promising input method that is gradually finding its way into the mainstream. An obvious question is whether it can be used for point-and-click tasks as an alternative to mouse or touch. Pointing with gaze is fast and natural, but its accuracy is limited by both the technical challenges of gaze tracking and inherent physiological limitations of the eye. Providing an alternative to clicking is also challenging. We consider use cases where purely gaze-based input is desired and the click targets are discrete user interface (UI) elements too small to be reliably resolved by gaze alone, e.g., links in hypertext. We present Actigaze, a new gaze-only click alternative which is fast and accurate for this scenario. A clickable UI element is selected by dwelling on one of a set of confirm buttons, based on two main design contributions: first, the confirm buttons stay at fixed positions with easily distinguishable visual identifiers such as colors, enabling procedural learning of the confirm button positions; second, UI elements are associated with confirm buttons through these visual identifiers in a way that minimizes the likelihood of inadvertent clicks. We evaluate two variants of the proposed click alternative, comparing them against the mouse and another gaze-only click alternative.
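The association of UI elements with confirm buttons via visual identifiers can be illustrated with a minimal sketch. The abstract does not specify the assignment algorithm, so the greedy scheme, the color set, and the distance threshold below are assumptions for illustration only: the idea is that elements lying close together on screen (and thus hard to distinguish by gaze alone) receive different confirm-button colors, so a dwell on one colored confirm button disambiguates among them.

```python
# Illustrative sketch only -- NOT the authors' Actigaze implementation.
# Greedily assigns each clickable element one of a small set of confirm-button
# colors so that spatially close elements get different colors.
from math import hypot

COLORS = ["red", "green", "blue", "yellow"]  # hypothetical identifier set

def assign_colors(elements, radius=80.0):
    """For each element position (x, y), pick a color not already used by a
    previously colored element within `radius` pixels; if every color is in
    conflict, fall back to the color least used in that neighborhood."""
    assigned = []  # (x, y, color) for elements colored so far
    result = []
    for (x, y) in elements:
        nearby = [c for (px, py, c) in assigned
                  if hypot(px - x, py - y) < radius]
        free = [c for c in COLORS if c not in nearby]
        color = free[0] if free else min(COLORS, key=nearby.count)
        assigned.append((x, y, color))
        result.append(color)
    return result

# Three tightly packed links get distinct colors; a distant link may reuse one.
links = [(0, 0), (30, 0), (60, 0), (300, 0)]
print(assign_colors(links))
```

Under this sketch, the gaze need only land somewhere near a cluster of links; the dwell on a fixed-position confirm button of the matching color then selects the intended element, which is how inadvertent clicks on neighbors are avoided.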
