Using Variable Dwell Time to Accelerate Gaze-Based Web Browsing with Two-Step Selection

ABSTRACT To avoid the "Midas Touch" problem, gaze-based selection interfaces often introduce a dwell time: a fixed interval for which the user must fixate on an object before it is selected. Past interfaces have used a uniform dwell time across all objects. Here, we propose a gaze-based browser that uses a two-step selection policy with variable dwell time. In the first step, a command (e.g., "back" or "select") is chosen from a menu using a dwell time that is constant across commands. In the second step, if the "select" command was chosen, the user selects a hyperlink using a dwell time that varies across hyperlinks: we assign shorter dwell times to more likely hyperlinks and longer dwell times to less likely ones. To infer the likelihood that each hyperlink will be selected, we developed a probabilistic model of natural gaze behavior while surfing the web. We evaluated several heuristic and probabilistic methods for varying dwell time in both simulation and a user experiment. Our results demonstrate that variable dwell time improves the user experience compared with fixed dwell time, yielding fewer errors and faster selection. While all of the variable dwell time methods improved performance, the probabilistic models yielded much greater gains than the simple heuristics. The best-performing model reduces the error rate by 50% relative to a 100 ms uniform dwell time while maintaining a similar response time, and reduces response time by 60% relative to a 300 ms uniform dwell time while maintaining a similar error rate.
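The abstract does not give the exact mapping from a hyperlink's inferred selection probability to its dwell time, so the Python sketch below only illustrates the stated idea: likely links get shorter dwells, unlikely links get longer ones. The function name assign_dwell_times and the linear interpolation between t_min and t_max are assumptions for illustration, not the paper's published method; the 100 ms and 300 ms endpoints simply echo the uniform baselines quoted in the abstract.

```python
import numpy as np

def assign_dwell_times(link_probs, t_min=0.1, t_max=0.3):
    """Map per-link selection probabilities to per-link dwell times.

    Links the gaze model considers likely get dwell times near t_min;
    unlikely links get dwell times near t_max. The linear interpolation
    used here is an illustrative choice, not the paper's mapping.
    """
    p = np.asarray(link_probs, dtype=float)
    p = p / p.sum()                      # normalize to a distribution
    spread = p.max() - p.min()
    if spread == 0.0:
        # Degenerate case: all links equally likely, use the midpoint.
        return np.full_like(p, 0.5 * (t_min + t_max))
    # 0 for the most likely link, 1 for the least likely link.
    weight = (p.max() - p) / spread
    return t_min + weight * (t_max - t_min)

# Example: three links on a page, the first deemed most likely.
print(assign_dwell_times([0.6, 0.3, 0.1]))   # ~[0.10, 0.22, 0.30] s
```

A rank-based or logarithmic mapping would serve equally well here; the only property the scheme needs is that dwell time decreases monotonically with the inferred selection probability.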
