Gaze-based assistive technologies

The eyes play an important role in both perception and communication. Technical interfaces that exploit their versatility can bring significant improvements to people who are unable to speak or to perform selection tasks by other means, such as with their hands, feet, noses, or tools held in the mouth. Using the eyes to enter text into a computer system, known as gaze typing, is the most prominent gaze-based assistive technology. This article reviews the principles of eye movements, gives an overview of current eye-tracking systems, and discusses several approaches to gaze typing. With the recent advent of mobile eye-tracking systems, gaze-based assistive technology is no longer restricted to interaction with desktop computers and is ready to expand into other areas of everyday life. The second part of the article therefore discusses the use of gaze-based assistive technology in the household and in “the wild,” outside one’s own four walls.
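The core mechanism behind most gaze-typing systems is dwell-time selection: a key is "pressed" once the gaze has rested on it continuously for a fixed threshold, which distinguishes deliberate selection from incidental viewing (the so-called Midas-touch problem). The following is a minimal sketch of that idea; all names (`GazeSample`, `dwell_select`, `DWELL_MS`) and the threshold value are illustrative assumptions, not part of any particular system described in the article. Real systems add fixation smoothing, visual feedback, and error correction.

```python
from dataclasses import dataclass

# Illustrative dwell threshold; practical gaze-typing systems typically
# use values in the 400-1000 ms range, often adjustable per user.
DWELL_MS = 500

@dataclass
class GazeSample:
    t_ms: int   # timestamp in milliseconds
    key: str    # on-screen key currently under the gaze point ("" if none)

def dwell_select(samples):
    """Return the sequence of keys selected by dwell from a gaze stream."""
    typed = []
    current, start = None, 0
    for s in samples:
        if s.key != current:
            # Gaze moved to a different key: restart the dwell timer.
            current, start = s.key, s.t_ms
        elif current and s.t_ms - start >= DWELL_MS:
            # Dwell threshold reached: select the key.
            typed.append(current)
            # Require the gaze to re-enter the key before it can repeat.
            current, start = None, s.t_ms
    return typed

# Example: gaze rests on "h" for 600 ms, then on "i" for 600 ms,
# sampled every 100 ms.
stream = [GazeSample(t, "h") for t in range(0, 700, 100)] \
       + [GazeSample(t, "i") for t in range(700, 1400, 100)]
print(dwell_select(stream))  # -> ['h', 'i']
```

Lowering `DWELL_MS` speeds up typing but increases accidental selections; this speed–accuracy trade-off motivates the alternative approaches to gaze typing (gaze gestures, continuous zooming interfaces such as Dasher) that the article compares.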
