HGaze Typing: Head-Gesture Assisted Gaze Typing

This paper introduces a bimodal typing interface, HGaze Typing, which combines the simplicity of head gestures with the speed of gaze input to provide efficient and comfortable dwell-free text entry. HGaze Typing uses gaze path information to compute candidate words and allows explicit activation of common text entry commands, such as selection, deletion, and revision, via head gestures (nodding, shaking, and tilting). By adding a head-based input channel, HGaze Typing reduces the size of the screen regions devoted to cancel/deletion buttons and the word candidate list, which most eye-typing interfaces require. A user study finds that HGaze Typing outperforms a dwell-time-based keyboard in efficacy and user satisfaction. The results demonstrate that the proposed method of integrating gaze and head-movement inputs can serve as an effective interface for text entry and is robust to unintended selections.
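The abstract's core mechanism, computing candidate words from a gaze path, can be illustrated with a small sketch. The paper does not specify its matching algorithm here, so the following is only a hypothetical minimal version: each word's "ideal" path is the sequence of its key centers on a toy QWERTY layout, and candidates are ranked by a path-similarity measure (a discrete Fréchet distance is used below). The key coordinates, lexicon, and function names are all illustrative assumptions, not the authors' implementation.

```python
from math import hypot

# Hypothetical key centers on a unit-spaced QWERTY grid (illustrative only).
KEYS = {c: (float(x), float(y))
        for y, row in enumerate(["qwertyuiop", "asdfghjkl", "zxcvbnm"])
        for x, c in enumerate(row)}

def word_path(word):
    """Ideal gaze path for a word: the sequence of its key centers."""
    return [KEYS[c] for c in word]

def frechet(p, q):
    """Discrete Fréchet distance between two point sequences (dynamic programming)."""
    n, m = len(p), len(q)
    d = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            c = hypot(p[i][0] - q[j][0], p[i][1] - q[j][1])
            if i == 0 and j == 0:
                d[i][j] = c
            elif i == 0:
                d[i][j] = max(d[i][j - 1], c)
            elif j == 0:
                d[i][j] = max(d[i - 1][j], c)
            else:
                d[i][j] = max(min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1]), c)
    return d[n - 1][m - 1]

def rank_candidates(gaze_path, lexicon, k=3):
    """Return the k lexicon words whose ideal paths best match the gaze path."""
    return sorted(lexicon, key=lambda w: frechet(gaze_path, word_path(w)))[:k]

lexicon = ["hello", "help", "hold", "world"]
# A noise-free gaze path over the keys of "hello" ranks "hello" first (distance 0).
print(rank_candidates(word_path("hello"), lexicon))
```

In the actual interface, the word candidate list produced by such a ranking would then be navigated and confirmed with head gestures rather than dwell, which is what lets the candidate region shrink.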