Commentary on Section 4. Eye tracking in human-computer interaction and usability research: Ready to deliver the promises.

Publisher Summary

This chapter discusses the application of eye movements to user interfaces, both for analyzing interfaces (measuring usability) and as an actual control medium within a human–computer dialogue. For usability analysis, the user's eye movements are recorded during system use and analyzed retrospectively; the eye movements do not affect the interface in real time. As a direct control medium, the eye movements are captured and used in real time as an input to the user–computer dialogue. They may be the sole input, typically for disabled users or hands-busy applications, or one of several inputs combined with mouse, keyboard, sensors, or other devices. From the perspective of mainstream eye-movement research, human–computer interaction, together with related work in the broader field of communications and media research, appears as a new and very promising area of applied work. Both basic and applied work can profit from integration within a unified field of eye-movement research. The application of eye tracking in human–computer interaction remains a very promising approach, and its technological and market barriers are finally being reduced.
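To make the retrospective-analysis use concrete, the sketch below shows one common kind of offline processing: grouping recorded gaze samples into fixations with a simple dispersion-threshold pass. It is a minimal sketch, not code from the chapter; the function name, thresholds, and (time, x, y) sample format are assumptions chosen for illustration.

```python
# Minimal sketch (illustrative, not from the chapter): dispersion-threshold
# fixation detection over recorded gaze samples, the kind of retrospective
# analysis a usability study applies after a session.

def detect_fixations(samples, max_dispersion=30.0, min_duration=0.100):
    """Group gaze samples (t in seconds, x/y in screen pixels) into fixations.

    Returns a list of (start_t, end_t, mean_x, mean_y) tuples.
    """
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while the samples stay within the dispersion threshold.
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [x for _, x, _ in window]
            ys = [y for _, _, y in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        start_t, end_t = samples[i][0], samples[j][0]
        if end_t - start_t >= min_duration:
            window = samples[i:j + 1]
            fixations.append((
                start_t,
                end_t,
                sum(x for _, x, _ in window) / len(window),
                sum(y for _, _, y in window) / len(window),
            ))
            i = j + 1
        else:
            i += 1
    return fixations


if __name__ == "__main__":
    # Hypothetical trace: a ~300 ms dwell near (200, 200), then a saccade away.
    gaze = [(0.01 * k, 200 + (k % 3), 200 - (k % 2)) for k in range(30)]
    gaze += [(0.30 + 0.01 * k, 600 + 5 * k, 400) for k in range(10)]
    print(detect_fixations(gaze))
```

In a usability study, the resulting fixations would typically be mapped onto interface regions and aggregated into measures such as fixation counts and dwell times per element; a real-time gaze-input system would instead consume such events as they occur, alongside mouse and keyboard input.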
