Predicting Cognitive State from Eye Movements

In human vision, acuity and color sensitivity are greatest at the center of fixation and fall off rapidly as visual eccentricity increases. Humans exploit the high resolution of central vision by actively moving their eyes three to four times each second. Here we demonstrate that it is possible to classify the task a person is engaged in from their eye movements using multivariate pattern classification. The results have theoretical implications for computational and neural models of eye movement control, and practical implications for using passively recorded eye movements to infer the cognitive state of a viewer, information that can serve as input for intelligent human-computer interfaces and related applications.
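To make the classification idea concrete, the sketch below shows one way task decoding from eye movements can be set up. It is illustrative only: the summary features (fixation duration and saccade amplitude statistics), the task labels, the synthetic data, and the choice of a linear support-vector classifier with cross-validation are all assumptions for the demo, not the specific features or pipeline used in the study.

```python
# Illustrative sketch: classify viewing task from per-trial eye-movement
# summary features. Feature set, task labels, and classifier are assumptions;
# the data below are fabricated so the example runs end to end.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(0)

# Hypothetical per-trial features:
# [mean fixation duration (ms), SD of fixation duration (ms),
#  mean saccade amplitude (deg), SD of saccade amplitude (deg), fixation count]
tasks = ["reading", "pseudo_reading", "scene_search", "scene_memorization"]
n_trials_per_task = 50

X, y = [], []
for label, task in enumerate(tasks):
    # Fabricated feature distributions, offset per task only so the demo has
    # signal; real features would be computed from recorded fixations/saccades.
    base = np.array([250.0, 80.0, 4.0, 2.0, 120.0]) + label * np.array([15.0, 5.0, 0.5, 0.2, 8.0])
    noise = rng.normal(scale=[20.0, 10.0, 0.6, 0.3, 10.0], size=(n_trials_per_task, 5))
    X.append(base + noise)
    y.append(np.full(n_trials_per_task, label))

X = np.vstack(X)
y = np.concatenate(y)

# Multivariate pattern classification: standardize features, fit a linear SVM,
# and estimate decoding accuracy with stratified cross-validation.
clf = make_pipeline(StandardScaler(), LinearSVC(dual=False))
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))
print(f"Mean cross-validated accuracy: {scores.mean():.2f} (chance = {1 / len(tasks):.2f})")
```

Comparing cross-validated accuracy against chance (here 0.25 for four tasks) is the standard way to assess whether eye-movement features carry task information; the actual study's features, classifier, and validation scheme may differ from this sketch.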
