EyeTell: Tablet-based Calibration-free Eye-typing using Smooth-pursuit movements

As eye-tracking equipment becomes increasingly robust and lightweight, gaze-tracking technology opens up a wide range of applications. For short interactions, such as on public displays or in hospitals where patients need to communicate non-verbally after surgery, the application must be intuitive and must not require calibration. Gaze gestures such as smooth-pursuit eye movements can be detected without calibration. We report the performance of a calibration-free eye-typing application that uses only the front-facing camera of a tablet. In a user study with 29 participants, we obtained an average typing speed of 1.27 WPM after four trials and a maximum typing speed of 1.95 WPM.
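
For context, calibration-free smooth-pursuit interfaces are commonly implemented by correlating the raw, uncalibrated gaze signal with the known trajectories of moving on-screen targets and selecting the target whose motion the eyes follow. The Python sketch below illustrates that general correlation-based idea only; it is an assumption for illustration, not the EyeTell implementation, and the correlation threshold and two-axis agreement rule are hypothetical choices.

    import numpy as np

    def pearson(a, b):
        """Pearson correlation between two 1-D signals."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom > 0 else 0.0

    def detect_pursuit_target(gaze_xy, target_trajectories, threshold=0.8):
        """Match a window of raw gaze estimates to one of several moving targets.

        gaze_xy: (N, 2) array of uncalibrated gaze estimates over a time window.
        target_trajectories: dict mapping target id -> (N, 2) array of the
            target's on-screen positions over the same window.
        Returns the id of the target whose motion correlates most strongly with
        the gaze signal on both axes, or None if no target exceeds the threshold.
        """
        best_id, best_score = None, threshold
        for tid, traj in target_trajectories.items():
            corr_x = pearson(gaze_xy[:, 0], traj[:, 0])
            corr_y = pearson(gaze_xy[:, 1], traj[:, 1])
            score = min(corr_x, corr_y)  # require agreement on both axes
            if score > best_score:
                best_id, best_score = tid, score
        return best_id

Because only the correlation between gaze motion and target motion matters, the gaze estimates never need to be mapped to accurate screen coordinates, which is what makes the approach calibration-free.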
