Eye gaze tracking for human computer interaction

With a growing number of computer devices around us and the increasing time we spend interacting with them, we are strongly interested in finding new interaction methods that ease the use of computers or increase interaction efficiency. Eye tracking seems to be a promising technology for achieving this goal. This thesis researches interaction methods based on eye-tracking technology. After a discussion of the limitations of the eyes regarding accuracy and speed, including a general discussion of Fitts' law, the thesis follows three different approaches to utilizing eye tracking for computer input. The first approach investigates eye gaze as a pointing device in combination with a touch sensor for multimodal input and presents a method using a touch-sensitive mouse. The second approach examines people's ability to perform gestures with the eyes for computer input and the separation of gaze gestures from natural eye movements. The third approach deals with the information inherent in eye movements and its application to assisting the user. The thesis presents a usability tool for recording interaction and gaze activity, and it describes algorithms for reading detection. All approaches present results based on user studies conducted with prototypes developed for the purpose.
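As background for the Fitts' law discussion, a commonly used formulation in HCI is the Shannon form (cf. MacKenzie), which predicts the movement time MT to acquire a target of width W at distance D as

MT = a + b \log_2\left(\frac{D}{W} + 1\right)

where a and b are constants determined empirically for a given input device, and \log_2(D/W + 1) is the index of difficulty in bits. This is given here as a sketch of the standard model underlying such pointing-performance comparisons.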

[1]  G. G. Stokes "J." , 1890, The New Yale Book of Quotations.

[2]  R. Dodge,et al.  The angular velocity of eye movements , 1901 .

[3]  Paul M. Fitts,et al.  Eye movements of aircraft pilots during instrument-landing approaches. , 1950 .

[4]  P. Fitts The information capacity of the human motor system in controlling the amplitude of movement. , 1954, Journal of experimental psychology.

[5]  Parlette Gn 25 years later. , 1976, Journal of environmental health.

[6]  Stuart K. Card,et al.  Evaluation of mouse, rate-controlled isometric joystick, step keys, and text keys, for text selection on a CRT , 1987 .

[7]  Richard A. Bolt,et al.  Gaze-orchestrated dynamic windows , 1981, SIGGRAPH '81.

[8]  Allen Newell,et al.  The psychology of human-computer interaction , 1983 .

[9]  Irenäus Eibl-Eibesfeldt Die Biologie des menschlichen Verhaltens. Grundriß der Humanethologie , 1984 .

[10]  C. Ware,et al.  An evaluation of an eye tracker as a device for computer input2 , 1987, CHI '87.

[11]  Colin Ware,et al.  An evaluation of an eye tracker as a device for computer input2 , 1987, CHI 1987.

[12]  R. Lathe Phd by thesis , 1988, Nature.

[13]  E. Kośmicki Irenaus Eibl-Eibesfeldt, “Die Biologie des menschlichen Verhaltens”. Grundriss der Humanethologie, Munchen – Zurich 1984, Piper, ss. 998. , 1988, Anthropological Review.

[14]  I. MacKenzie,et al.  A note on the information-theoretic basis of Fitts' law. , 1989, Journal of motor behavior.

[15]  R A Abrams,et al.  Speed and accuracy of saccadic eye movements: characteristics of impulse variability in the oculomotor system. , 1989, Journal of experimental psychology. Human perception and performance.

[16]  Robert J. K. Jacob,et al.  What you look at is what you get: eye movement-based interaction techniques , 1990, CHI '90.

[17]  R. Jacob The use of eye movements in human-computer interaction techniques: what you look at is what you get , 1991, TOIS.

[18]  Abigail Sellen,et al.  A comparison of input devices in element pointing and dragging tasks , 1991, CHI.

[19]  I. Scott MacKenzie,et al.  Movement time prediction in human-computer interfaces , 1992 .

[20]  David Goldberg,et al.  Touch-typing with a stylus , 1993, INTERCHI.

[21]  Francis K. H. Quek Eyes in the interface , 1995, Image Vis. Comput..

[22]  Robert J. K. Jacob,et al.  Eye tracking in advanced interface design , 1995 .

[23]  H. Deubel,et al.  Saccade target selection and object recognition: Evidence for a common attentional mechanism , 1996, Vision Research.

[24]  Richard Hull,et al.  Towards situated computing , 1997, Digest of Papers. First International Symposium on Wearable Computers.

[25]  Erik D. Reichle,et al.  Toward a model of eye movement control in reading. , 1998, Psychological review.

[26]  S. Crawford,et al.  Volume 1 , 2012, Journal of Diabetes Investigation.

[27]  Gregory D. Abowd,et al.  Cirrin: a word-level unistroke keyboard for pen input , 1998, UIST '98.

[28]  Shumin Zhai,et al.  Manual and gaze input cascaded (MAGIC) pointing , 1999, CHI '99.

[29]  Mike Sinclair,et al.  Touch-sensing input devices , 1999, CHI '99.

[30]  Arnon Amir,et al.  Framerate pupil detector and gaze tracker , 1999, ICCV 1999.

[31]  Takeo Kanade,et al.  Dual-state parametric eye tracking , 2000, Proceedings Fourth IEEE International Conference on Automatic Face and Gesture Recognition (Cat. No. PR00580).

[32]  Robert J. K. Jacob,et al.  Evaluation of eye gaze interaction , 2000, CHI.

[33]  John L. Sibert,et al.  The reading assistant: eye gaze triggered auditory prompting for reading remediation , 2000, UIST '00.

[34]  Joseph H. Goldberg,et al.  Identifying fixations and saccades in eye-tracking protocols , 2000, ETRA.

[35]  Scott D. Brown,et al.  The power law repealed: The case for an exponential law of practice , 2000, Psychonomic bulletin & review.

[36]  Robert J. K. Jacob,et al.  Interacting with eye movements in virtual environments , 2000, CHI.

[37]  Carlos Hitoshi Morimoto,et al.  Pupil detection and tracking using multiple light sources , 2000, Image Vis. Comput..

[38]  John R. Anderson,et al.  Intelligent gaze-added interfaces , 2000, CHI.

[39]  Ilse Ravyse,et al.  Eye activity detection and recognition using morphological scale-space decomposition , 2000, Proceedings 15th International Conference on Pattern Recognition. ICPR-2000.

[40]  Poika Isokoski,et al.  Text input methods for eye trackers using off-screen targets , 2000, ETRA.

[41]  Yu-Te Wu,et al.  A calibration-free gaze tracking technique , 2000, Proceedings 15th International Conference on Pattern Recognition. ICPR-2000.

[42]  Paul P. Maglio,et al.  A robust algorithm for reading detection , 2001, PUI '01.

[43]  Robert J. K. Jacob,et al.  Evaluation and Analysis of Eye Gaze Interaction , 2001 .

[44]  Shumin Zhai,et al.  Chinese input with keyboard and eye-tracking: an anatomical study , 2001, CHI.

[45]  Anton Nijholt,et al.  Eye gaze patterns in conversations: there is more to conversational agents than meets the eyes , 2001, CHI.

[46]  Myron Flickner,et al.  Differences in the infrared bright pupil response of human eyes , 2002, ETRA.

[47]  Carlos Hitoshi Morimoto,et al.  Free head motion eye gaze tracking without calibration , 2002, CHI Extended Abstracts.

[48]  Ravin Balakrishnan,et al.  Acquisition of expanding targets , 2002, CHI.

[49]  Roel Vertegaal,et al.  Designing attentive cell phone using wearable eyecontact sensors , 2002, CHI Extended Abstracts.

[50]  Naoki Mukawa,et al.  FreeGaze: a gaze tracking system for everyday gaze interaction , 2002, ETRA.

[51]  Päivi Majaranta,et al.  Twenty years of eye typing: systems and design issues , 2002, ETRA.

[52]  Ohno Takehiko A gaze tracking system for everyday gaze interaction , 2002 .

[53]  Nico Thomae,et al.  HOUGH-TRANSFORMATION ZUR BILDVERARBEITUNG BEI DER BLICKRICHTUNGSBESTIMMUNG , 2002 .

[54]  Zhiwei Zhu,et al.  Real-time eye detection and tracking under various light conditions , 2002, ETRA.

[55]  Brad A. Myers,et al.  EdgeWrite: a stylus-based text entry method designed for high accuracy and stability of motion , 2003, UIST '03.

[56]  Oleg Spakov,et al.  Symbol Creator: An Alternative Eye-based Text Entry Technique with Low Demand for Screen Space , 2003, INTERACT.

[57]  Nicholas J Wade,et al.  Dodge-Ing the Issue: Dodge, Javal, Hering, and the Measurement of Saccades in Eye-Movement Research , 2003, Perception.

[58]  Shumin Zhai,et al.  Human on-line response to target expansion , 2003, CHI '03.

[59]  Robert J. K. Jacob,et al.  Eye tracking in human-computer interaction and usability research : Ready to deliver the promises , 2002 .

[60]  Albrecht Schmidt,et al.  Ubiquitous computing - computing in context , 2003 .

[61]  Y. V. Venkatesh,et al.  Eye gaze based reading detection , 2003, TENCON 2003. Conference on Convergent Technologies for Asia-Pacific Region.

[62]  Slavko Milekic The More You Look the More You Get: Intention-Based Interface Using Gaze-Tracking. , 2003 .

[63]  I. Scott MacKenzie,et al.  Auditory and visual feedback during eye typing , 2003, CHI Extended Abstracts.

[64]  Brian P. Bailey,et al.  Using Eye Gaze Patterns to Identify User Tasks , 2004 .

[65]  Naoki Mukawa,et al.  A free-head, simple calibration, gaze tracking system that enables gaze-based interaction , 2004, ETRA.

[66]  Manfred Tscheligi,et al.  CHI '04 Extended Abstracts on Human Factors in Computing Systems , 2004, CHI 2004.

[67]  Päivi Majaranta,et al.  Effects of feedback on eye typing with a short dwell time , 2004, ETRA.

[68]  Mads Nielsen,et al.  Eye tracking off the shelf , 2004, ETRA.

[69]  Jeffrey S. Shell,et al.  ECSGlasses and EyePliances: using attention to open sociable windows of interaction , 2004, ETRA.

[70]  Shumin Zhai,et al.  Characterizing computer input with Fitts' law parameters-the information and non-information aspects of pointing , 2004, Int. J. Hum. Comput. Stud..

[71]  Bing Pan,et al.  The determinants of web page viewing behavior: an eye-tracking study , 2004, ETRA.

[72]  E. Schneider Torsionelle Augenbewegungen bei galvanischen, natürlichen und pathologischen Reizzuständen des menschlichen Gleichgewichtssystems , 2004 .

[73]  I. Scott MacKenzie,et al.  Eye gaze interaction with expanding targets , 2004, CHI EA '04.

[74]  Kentaro Go,et al.  Resolving ambiguities of a gaze and speech interface , 2004, ETRA.

[75]  Anthony J. Hornof,et al.  EyeDraw: a system for drawing pictures with the eyes , 2004, CHI EA '04.

[76]  Roel Vertegaal,et al.  EyeWindows: evaluation of eye-controlled zooming windows for focus selection , 2005, CHI.

[77]  Roel Vertegaal,et al.  eyeLook: using attention to facilitate mobile media consumption , 2005, UIST '05.

[78]  Andrew T. Duchowski,et al.  Efficient eye pointing with a fisheye lens , 2005, Graphics Interface.

[79]  Oleg Spakov,et al.  Gaze-based selection of standard-size menu items , 2005, ICMI '05.

[80]  Roel Vertegaal,et al.  Media eyepliances: using eye tracking for remote control focus selection of appliances , 2005, CHI Extended Abstracts.

[81]  Julie E. Boland,et al.  Cultural variation in eye movements during scene perception. , 2005, Proceedings of the National Academy of Sciences of the United States of America.

[82]  David Beymer,et al.  WebGazeAnalyzer: a system for capturing and analyzing web reading behavior using eye gaze , 2005, CHI Extended Abstracts.

[83]  Dongheng Li,et al.  Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches , 2005, 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05) - Workshops.

[84]  Shumin Zhai,et al.  Conversing with the user based on eye-gaze patterns , 2005, CHI.

[85]  Peter D. Lawrence,et al.  A single camera eye-gaze tracking system with free head motion , 2006, ETRA.

[86]  Ralf Engbert,et al.  Microsaccades are triggered by low retinal image slip. , 2006, Proceedings of the National Academy of Sciences of the United States of America.

[87]  Klaus Bartl,et al.  A pivotable head mounted camera system that is aligned by three-dimensional eye movements , 2006, ETRA.

[88]  David Salesin,et al.  Gaze-based interaction for semi-automatic photo cropping , 2006, CHI.

[89]  Cristina Conati,et al.  Eye-tracking to model and adapt to user meta-cognition in intelligent learning environments , 2006, IUI '06.

[90]  M. Land Eye movements and the control of actions in everyday life , 2006, Progress in Retinal and Eye Research.

[91]  Brad A. Myers,et al.  Trackball text entry for people with motor impairments , 2006, CHI.

[92]  Albrecht Schmidt,et al.  Mauszeigerpositionierung mit dem Auge , 2006, Mensch & Computer.

[93]  Dongheng Li,et al.  openEyes: a low-cost head-mounted eye-tracking solution , 2006, ETRA.

[94]  Albrecht Schmidt,et al.  Knowing the User's Every Move – User Activity Tracking for Website Usability Evaluation and Implicit Interaction , 2006 .

[95]  Klaus Bartl,et al.  Mobile eye tracking as a basis for real-time control of a gaze driven head-mounted video camera , 2006, ETRA '06.

[96]  Andreas Paepcke,et al.  EyePoint: practical pointing and selection using gaze and keyboard , 2007, CHI.

[97]  Albrecht Schmidt,et al.  Detailed Monitoring of User's Gaze and Interaction to Improve Future E-Learning , 2007, HCI.

[98]  Albrecht Schmidt,et al.  Keystroke-level model for advanced mobile phone interaction , 2007, CHI.

[99]  Albrecht Schmidt,et al.  Interacting with the Computer Using Gaze Gestures , 2007, INTERACT.

[100]  Albrecht Schmidt,et al.  Eye-gaze interaction for mobile phones , 2007, Mobility '07.

[101]  Tal Garfinkel,et al.  Reducing shoulder-surfing by using gaze-based password entry , 2007, SOUPS '07.

[102]  Albrecht Schmidt,et al.  A Proxy-Based Infrastructure for Web Application Sharing and Remote Collaboration on Web Pages , 2007, INTERACT.

[103]  Albrecht Schmidt,et al.  Blickgesten als Fernbedienung , 2007, Mensch & Computer.

[104]  Xuan Zhang,et al.  Evaluating Eye Tracking with ISO 9241 - Part 9 , 2007, HCI.

[105]  Jacob O. Wobbrock,et al.  Not Typing but Writing: Eye-based Text Entry Using Letter-like Gestures , 2007 .

[106]  Aude Billard,et al.  Calibration-Free Eye Gaze Direction Detection with Gaussian Processes , 2008, VISAPP.

[107]  Elisabeth André,et al.  Writing with Your Eye: A Dwell Time Free Writing System Adapted to the Nature of Human Eye Gaze , 2008, PIT.

[108]  Marco Porta,et al.  Eye-S: a full-screen input modality for pure eye-based communication , 2008, ETRA.

[109]  Gerhard Tröster,et al.  EyeMote - Towards Context-Aware Gaming Using Eye Movements Recorded from Wearable Electrooculography , 2008, Fun and Games.

[110]  Richard Atterer Usability tool support for model-based web development , 2008 .

[111]  Roel Vertegaal A Fitts Law comparison of eye tracking and manual input in the selection of visual targets , 2008, ICMI '08.

[112]  Gerhard Tröster,et al.  It’s in Your Eyes - Towards Context-Awareness and Mobile HCI Using Wearable EOG Goggles , 2008 .

[113]  Gerhard Tröster,et al.  Robust Recognition of Reading Activity in Transit Using Wearable Electrooculography , 2009, Pervasive.

[114]  Jacob O. Wobbrock,et al.  Longitudinal evaluation of discrete consecutive gaze gestures for text entry , 2008, ETRA.

[115]  Albrecht Schmidt,et al.  The MAGIC Touch: Combining MAGIC-Pointing with a Touch-Sensitive Mouse , 2009, INTERACT.

[116]  Tommy Strandvall,et al.  Eye Tracking in Human-Computer Interaction and Usability Research , 2009, INTERACT.