Chapter 22 – Voluntary Eye Movements in Human–Computer Interaction

Publisher Summary

This chapter presents an overview of human–computer interaction (HCI) techniques that use voluntary eye movements as input for computers. Interaction, whether between humans or between a human and a computer, can be classified as either purely unimodal or multimodal. In unimodal interaction, communicative signals are transmitted through one modality only; a telephone conversation is an example of unimodal human interaction, since the sender uses speech only and the receiver uses hearing only. In multimodal interaction, communicative signals are transmitted through several different modalities. Compared with human–human interaction, HCI is more limited in both its input and output methods. Most systems accept input through only one modality, usually the hands, which means that other input modalities that people use naturally in human–human interaction remain unused in HCI. It is, however, possible to utilize other modalities in HCI as well: speech, eye movements, and facial muscle activity, for example, offer promising alternative input methods for a computer. If information from the user to the computer, and vice versa, could be transmitted through several modalities, the result would conceivably be more efficient, versatile, flexible, and ultimately more natural interaction between the user and the computer, closer to human–human interaction.
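The idea of the eye as a pointing channel can be made concrete with dwell-time selection, a common technique in gaze interfaces for replacing the mouse click: the gaze position acts as the cursor, and holding the gaze inside a target region for a threshold duration triggers selection. The sketch below is a minimal illustration only; the Target class, the dwell_select function, the 500 ms threshold, and the simulated gaze stream are assumptions made for this example, not details taken from the chapter.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, List, Optional, Tuple

@dataclass
class Target:
    """A rectangular on-screen target, e.g. a button or a key on a virtual keyboard."""
    name: str
    x: float  # left edge, pixels
    y: float  # top edge, pixels
    w: float  # width, pixels
    h: float  # height, pixels

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

DWELL_SECONDS = 0.5  # assumed dwell threshold; real systems tune this per user and task

def dwell_select(
    gaze_samples: Iterable[Tuple[float, float, float]],
    targets: List[Target],
) -> Iterator[Target]:
    """Yield a target each time the gaze dwells on it for DWELL_SECONDS.

    `gaze_samples` is an iterable of (timestamp_s, x, y) tuples; with real
    hardware it would come from the eye tracker's streaming API.
    """
    current: Optional[Target] = None  # target currently fixated, if any
    dwell_start = 0.0                 # timestamp when the current fixation began
    for t, gx, gy in gaze_samples:
        hit = next((tg for tg in targets if tg.contains(gx, gy)), None)
        if hit is not current:
            current, dwell_start = hit, t  # gaze entered a new region: restart the clock
        elif current is not None and t - dwell_start >= DWELL_SECONDS:
            yield current                  # dwell threshold reached: acts as a "click"
            dwell_start = t                # reset so the target is not selected repeatedly

if __name__ == "__main__":
    ok_button = Target("OK", x=100, y=100, w=80, h=40)
    # Simulated 60 Hz gaze stream fixating the button for about 0.65 seconds.
    stream = [(i / 60.0, 120.0, 120.0) for i in range(40)]
    for selected in dwell_select(stream, [ok_button]):
        print(f"selected: {selected.name}")
```

The dwell threshold is the central design trade-off in such interfaces: a short threshold makes selection fast but risks the "Midas touch" problem of activating everything the user merely looks at, while a long threshold slows the interaction down.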
