GazeEverywhere: Enabling Gaze-only User Interaction on an Unmodified Desktop PC in Everyday Scenarios

Eye tracking is becoming increasingly affordable, and gaze therefore has the potential to become a viable input modality for human-computer interaction. We present GazeEverywhere, a solution that can replace the mouse with gaze control by adding a transparent layer on top of the system GUI. It comprises three parts: i) the SPOCK interaction method, which is based on smooth pursuit eye movements and does not suffer from the Midas touch problem; ii) an online recalibration algorithm that continuously improves gaze-tracking accuracy, using the SPOCK target projections as reference points; and iii) an optional hardware setup that uses head-up display technology to project superimposed dynamic stimuli onto the PC screen where a software modification of the system is not feasible. In validation experiments, we show that GazeEverywhere's throughput according to ISO 9241-9 exceeded that of dwell-time-based interaction methods and nearly reached trackpad level. Online recalibration reduced the required interaction target ('button') size by about 25%. Finally, a case study showed that users were able to browse the internet and successfully complete a Wikirace using gaze only, without any plug-ins or other modifications.
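
The core selection mechanism can be illustrated with a short sketch. In pursuit-based interaction, the system correlates the recent gaze trajectory with the trajectories of moving on-screen targets and triggers a selection only when the two match; because static content is never followed with a smooth pursuit movement, merely looking at an element cannot activate it, which is how this class of techniques avoids the Midas touch problem. The code below is a minimal sketch of that matching principle, not the exact SPOCK implementation; the correlation threshold, window length, and function names are assumptions.

    import numpy as np

    CORR_THRESHOLD = 0.8   # assumed value; the paper does not state its threshold
    WINDOW = 30            # ~0.5 s of samples at 60 Hz (assumption)

    def pearson(a, b):
        """Pearson correlation of two 1-D sample sequences."""
        a, b = np.asarray(a, float), np.asarray(b, float)
        if a.std() == 0 or b.std() == 0:
            return 0.0
        return float(np.corrcoef(a, b)[0, 1])

    def matched_target(gaze_xy, targets_xy):
        """Return the id of the moving target whose trajectory the gaze follows,
        or None if no target correlates strongly enough.

        gaze_xy    -- the last WINDOW gaze samples, each an (x, y) pair
        targets_xy -- {target_id: the same WINDOW target positions}
        """
        gx = [p[0] for p in gaze_xy]
        gy = [p[1] for p in gaze_xy]
        best_id, best_corr = None, CORR_THRESHOLD
        for tid, traj in targets_xy.items():
            tx = [p[0] for p in traj]
            ty = [p[1] for p in traj]
            # Both axes must correlate: following the stimulus in x alone
            # (e.g., while reading) must not trigger a selection.
            corr = min(pearson(gx, tx), pearson(gy, ty))
            if corr > best_corr:
                best_id, best_corr = tid, corr
        return best_id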

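The online recalibration admits a similar sketch. Every successful pursuit selection yields a correspondence between the target projection's known position and the gaze position the tracker reported, so the offset vectors collected this way can be interpolated across the screen to correct subsequent gaze samples. The sketch below uses Shepard's classic inverse-distance weighting as the interpolation scheme; the class name, the IDW exponent, and the exact weighting are assumptions rather than the paper's specification.

    import numpy as np

    class OnlineRecalibrator:
        """Interpolates correction offsets from reference points collected
        during normal interaction (a sketch, assuming Shepard-style IDW)."""

        def __init__(self, power=2.0):
            self.refs = []      # list of (measured_xy, offset_xy)
            self.power = power  # IDW exponent (assumption)

        def add_reference(self, measured_xy, true_xy):
            """Store one correspondence gathered at a confirmed selection."""
            m = np.asarray(measured_xy, float)
            t = np.asarray(true_xy, float)
            self.refs.append((m, t - m))

        def correct(self, gaze_xy):
            """Return the gaze sample shifted by the interpolated offset."""
            g = np.asarray(gaze_xy, float)
            if not self.refs:
                return g
            offsets, weights = [], []
            for m, off in self.refs:
                d = np.linalg.norm(g - m)
                if d < 1e-9:          # exactly on a reference point
                    return g + off
                offsets.append(off)
                weights.append(d ** -self.power)
            w = np.asarray(weights)
            return g + (w[:, None] * np.asarray(offsets)).sum(0) / w.sum()

In practice the stored reference points could be capped or aged out so that the correction keeps adapting as the user shifts posture, though the paper's retention policy is not stated here.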
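
Finally, the throughput figures above follow the ISO 9241-9 methodology. A common formulation computes an effective index of difficulty from the nominal movement distance and the scatter of the selection endpoints, then divides it by the mean movement time; the helper below is a sketch of that standard calculation, not the paper's analysis code.

    import math

    def throughput(distance_px, endpoint_sd_px, movement_time_s):
        """ISO 9241-9 style throughput in bits/s for one condition.

        distance_px     -- nominal distance D to the target
        endpoint_sd_px  -- standard deviation of the selection endpoints
        movement_time_s -- mean movement time MT

        The effective width We = 4.133 * SD maps endpoint scatter to a
        96% hit rate; ID_e = log2(D / We + 1); TP = ID_e / MT.
        """
        w_e = 4.133 * endpoint_sd_px
        id_e = math.log2(distance_px / w_e + 1.0)
        return id_e / movement_time_s

    # Example with made-up numbers: 400 px movements, 25 px endpoint SD,
    # 1.2 s mean movement time -> about 1.9 bits/s.
    print(round(throughput(400, 25, 1.2), 2), "bits/s")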