Swipe&Switch: Text Entry Using Gaze Paths and Context Switching

Swipe-based methods for gaze text entry let users trace the letters of a word with their gaze, analogous to swiping a finger across a touchscreen keyboard. These methods face two challenges: (1) gaze paths lack clear start and end positions, and (2) text editing features are difficult to design. We introduce Swipe&Switch, a text-entry interface that combines swiping with context switching to improve gaze-based interaction. The interface contains three context regions; when the user switches focus between these regions, it detects the start and end of a gesture and emits text editing commands (e.g., word insertion or deletion). A user study showed that Swipe&Switch provides a better user experience and a higher text entry rate than a baseline, EyeSwipe.
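As an illustrative sketch of the interaction described above (not the authors' implementation), the Python snippet below models context switching with a small state machine: entering a hypothetical keyboard region starts recording a gaze path, switching to an action region ends the gesture and hands the path to a word decoder, and switching to an edit region emits an editing command. The region names, the on_gaze callback, and the command set are assumptions made for this example.

from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List, Tuple


class Region(Enum):
    KEYBOARD = auto()  # letters; gaze paths are drawn here (assumed region)
    ACTION = auto()    # switching here ends a gesture / confirms a word (assumed)
    EDIT = auto()      # switching here triggers editing commands (assumed)


@dataclass
class SwipeSwitchSketch:
    current: Region = Region.ACTION
    recording: bool = False
    path: List[Tuple[float, float]] = field(default_factory=list)

    def on_gaze(self, region: Region, x: float, y: float) -> None:
        # Handle one gaze sample: detect region switches and collect the path.
        if region is not self.current:
            self._switched(self.current, region)
            self.current = region
        if self.recording and region is Region.KEYBOARD:
            self.path.append((x, y))

    def _switched(self, src: Region, dst: Region) -> None:
        if dst is Region.KEYBOARD:
            # Entering the keyboard region marks the start of a swipe gesture.
            self.recording, self.path = True, []
        elif src is Region.KEYBOARD and dst is Region.ACTION and self.recording:
            # Leaving the keyboard for the action region marks the end of the
            # gesture; the recorded path would be decoded into candidate words.
            self.recording = False
            print(f"gesture ended: {len(self.path)} samples -> decode word")
        elif dst is Region.EDIT:
            # Switching focus to the edit region emits an editing command.
            print("edit command: delete previous word")

In this framing, the region switch itself is the delimiter, so no dwell timeout or explicit start/end trigger is needed to segment the gaze path.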

[1] Hai-Ning Liang et al. RingText: Dwell-free and hands-free Text Entry for Mobile Head-Mounted Displays using Head Motions. IEEE Transactions on Visualization and Computer Graphics, 2019.
[2] Päivi Majaranta et al. Twenty years of eye typing: systems and design issues. ETRA, 2002.
[3] Marco Porta et al. Eye-S: a full-screen input modality for pure eye-based communication. ETRA, 2008.
[4] Anke Huckauf et al. Gazing with pEYE: new concepts in eye typing. APGV, 2007.
[5] Jacob O. Wobbrock et al. Longitudinal evaluation of discrete consecutive gaze gestures for text entry. ETRA, 2008.
[6] Shumin Zhai et al. SHARK2: a large vocabulary shorthand writing system for pen-based computers. UIST, 2004.
[7] Oleg Spakov et al. On-line adjustment of dwell time for target selection by gaze. NordiCHI, 2004.
[8] Per Ola Kristensson et al. The potential of dwell-free eye-typing for fast assistive gaze communication. ETRA, 2012.
[9] John Paulin Hansen et al. Noise tolerant selection by gaze-controlled pan and zoom in 3D. ETRA, 2008.
[10] Poika Isokoski et al. Text input methods for eye trackers using off-screen targets. ETRA, 2000.
[11] Maria da Graça Campos Pimentel et al. Filteryedping: Design Challenges and User Performance of Dwell-Free Eye Typing. ACM Transactions on Accessible Computing (TACCESS), 2015.
[12] Meredith Ringel Morris et al. Improving Dwell-Based Gaze Typing with Dynamic, Cascading Dwell Times. CHI, 2017.
[13] Alan F. Blackwell et al. Dasher—a data entry interface using continuous gestures and language models. UIST, 2000.
[14] Carlos Hitoshi Morimoto et al. Context switching for fast key selection in text entry applications. ETRA, 2010.
[15] Oleg Spakov et al. Fast gaze typing with an adjustable dwell time. CHI, 2009.
[16] Carlos Hitoshi Morimoto et al. AugKey: Increasing Foveal Throughput in Eye Typing with Augmented Keys. CHI, 2016.
[17] I. Scott MacKenzie et al. Phrase sets for evaluating text entry techniques. CHI Extended Abstracts, 2003.
[18] Robert J. K. Jacob et al. What you look at is what you get: eye movement-based interaction techniques. CHI, 1990.
[19] Margrit Betke et al. EyeSwipe: Dwell-free Text Entry Using Gaze Paths. CHI, 2016.
[20] Sayan Sarcar et al. EyeK: an efficient dwell-free eye gaze-based text entry system. APCHI, 2013.