A study on text entry methods based on eye gestures

Purpose – The purpose of this paper is to consider the two main existing text input techniques based on "eye gestures" – namely EyeWrite and Eye-S – and compare them to each other and to the traditional "virtual keyboard" approach.

Design/methodology/approach – The study primarily aims to assess user performance at the very beginning of the learning process. However, a partial longitudinal evaluation is also provided. Two kinds of experiments were carried out, involving 14 testers.

Findings – Results show that while the virtual keyboard is faster, EyeWrite and Eye-S are also appreciated and can be viable alternatives (after a proper training period).

Practical implications – Writing methods based on eye gestures deserve special attention, as they require less screen space and need limited tracking precision. This study highlights the fact that gesture-based techniques imply a greater initial effort, and require proper training not only to gain knowledge of eye interaction per se, but also for learning...
