Improving Dwell-Based Gaze Typing with Dynamic, Cascading Dwell Times

We present cascading dwell gaze typing, a novel approach to dwell-based eye typing that dynamically adjusts each key's dwell time based on the likelihood that the key will be selected next and on its location on the keyboard. Our approach makes unlikely keys harder to select and likely keys easier to select by increasing and decreasing their required dwell times, respectively. To maintain a smooth typing rhythm, we cascade the dwell time of likely keys, slowly decreasing the minimum allowable dwell time as the user enters text. Cascading affords users the benefits of faster dwell times while causing little disruption to their typing cadence. In a longitudinal study with 17 non-disabled participants, our dynamic cascading dwell technique was significantly faster than a static dwell approach: participants averaged 12.39 WPM with the cascading technique versus 10.62 WPM with a static dwell time. In a small evaluation with five people with ALS, participants averaged 9.51 WPM with the cascading approach. These results show that dynamic cascading dwell has the potential to improve gaze typing for users with and without disabilities.
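The core policy described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: all thresholds, timings, and scale factors (`LIKELY_THRESHOLD`, `BASE_DWELL_MS`, `MIN_DWELL_MS`, `CASCADE_STEP_MS`, `UNLIKELY_SCALE`) are assumed values chosen for demonstration.

```python
LIKELY_THRESHOLD = 0.10   # assumed cutoff between "likely" and "unlikely" keys
BASE_DWELL_MS = 600.0     # assumed starting dwell time for every key
MIN_DWELL_MS = 250.0      # assumed floor the cascade may never go below
CASCADE_STEP_MS = 25.0    # assumed per-keystroke reduction of the likely-key dwell
UNLIKELY_SCALE = 1.4      # assumed inflation factor for unlikely keys

class CascadingDwell:
    """Sketch of a dynamic, cascading dwell-time policy (illustrative only)."""

    def __init__(self) -> None:
        # The dwell for likely keys starts at the base and cascades downward.
        self.likely_dwell = BASE_DWELL_MS

    def dwell_for_key(self, p_next: float) -> float:
        """Required dwell (ms) for a key whose next-selection probability is p_next."""
        if p_next >= LIKELY_THRESHOLD:
            return self.likely_dwell            # likely key: cascaded (shorter) dwell
        return BASE_DWELL_MS * UNLIKELY_SCALE   # unlikely key: inflated dwell

    def on_selection(self) -> None:
        """After each keystroke, lower the likely-key dwell slightly, down to the floor."""
        self.likely_dwell = max(MIN_DWELL_MS, self.likely_dwell - CASCADE_STEP_MS)
```

A language model would supply `p_next` for each key; the gradual per-keystroke reduction (rather than an immediate jump to the minimum dwell) is what preserves the user's typing cadence.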
