BlinkWrite: efficient text entry using eye blinks

In this paper, a new text entry system is proposed, implemented, and evaluated. BlinkWrite provides a communication gateway for cognitively able, motor-impaired individuals who cannot use a traditional eye-tracking system. In contrast to most hands-free systems, BlinkWrite allows text to be entered and corrected using a single input modality: blinks. The system was implemented using a scanning ambiguous keyboard, a new form of scanning keyboard that allows English text to be entered in less than two scanning intervals per character. In a user study, 12 participants entered text using the system with three settings for the scanning interval: 1,000, 850, and 700 ms. An average text entry rate of 4.8 wpm was observed with an accuracy of 97%. The highest average text entry rate was achieved with the 850 ms scanning interval.
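To put these figures in context, the claim of fewer than two scanning intervals per character implies a simple ceiling on entry rate. The sketch below is a back-of-envelope calculation, not the model used in the paper: it assumes the standard convention of five characters per word and a fixed, hypothetical scans_per_char value of 2.

# Illustrative upper bound on scanning-keyboard entry rate (assumption-based sketch).
# Assumes the standard wpm convention of 5 characters per word (including spaces)
# and a fixed number of scan steps per character; both are assumptions for
# illustration, not parameters reported in the paper.

def max_wpm(scan_interval_ms: float, scans_per_char: float = 2.0,
            chars_per_word: float = 5.0) -> float:
    """Theoretical ceiling on words per minute for a scanning keyboard."""
    seconds_per_char = (scan_interval_ms / 1000.0) * scans_per_char
    return 60.0 / (seconds_per_char * chars_per_word)

for interval_ms in (1000, 850, 700):
    print(f"{interval_ms} ms interval: <= {max_wpm(interval_ms):.1f} wpm")
# Output: 6.0 wpm at 1,000 ms, about 7.1 wpm at 850 ms, about 8.6 wpm at 700 ms.

Under these assumptions the 850 ms setting caps throughput at roughly 7 wpm, so the observed 4.8 wpm sits plausibly below that ceiling once selection overhead and blink-based error correction are accounted for.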
