Classification of visual and linguistic tasks using eye-movement features.

The role of the task has received special attention in visual-cognition research because it can provide causal explanations of goal-directed eye-movement responses. The dependency between visual attention and task suggests that eye movements can be used to classify the task being performed. A recent study by Greene, Liu, and Wolfe (2012), however, failed to achieve accurate classification of visual tasks based on eye-movement features. In the present study, we hypothesize that tasks can be successfully classified when they differ with respect to the involvement of other cognitive domains, such as language processing. We extract the eye-movement features used by Greene et al., as well as additional features, from the data of three different tasks: visual search, object naming, and scene description. First, we demonstrate that eye-movement responses make it possible to characterize the goals of these tasks. Then, we train three different types of classifiers and predict the task participants performed with an accuracy well above chance (a maximum of 88% for visual search). An analysis of the relative importance of features for classification accuracy reveals that a single feature, initiation time, is sufficient for above-chance performance (a maximum of 79% accuracy in object naming). Crucially, this feature is independent of task duration, which differs systematically across the three tasks we investigated. Overall, the best task-classification performance was obtained with a set of seven features that included both spatial information (e.g., the entropy of attention allocation) and temporal components (e.g., total fixation on objects) of the eye-movement record. This result confirms the task-dependent allocation of visual attention and extends previous work by showing that task classification is possible when tasks differ in the cognitive processes involved (purely visual tasks such as search vs. communicative tasks such as scene description).
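The classification approach described above, extracting per-trial eye-movement features and training a classifier to predict which task generated them, can be illustrated with a minimal sketch. The example below is hypothetical and is not the study's pipeline: it simulates synthetic initiation times (the single most diagnostic feature reported above) for the three tasks, and uses a simple nearest-centroid classifier rather than the classifiers used in the study. All distribution parameters (task means, noise) are invented for demonstration.

```python
# Minimal sketch (not the paper's method): classify which of three tasks
# (search, naming, description) produced a trial, using one synthetic
# eye-movement feature, initiation time, and a nearest-centroid classifier.
import random
from statistics import mean

random.seed(0)

TASKS = ["search", "naming", "description"]
# Hypothetical mean initiation times in ms; values are purely illustrative.
TASK_MEANS = {"search": 220.0, "naming": 300.0, "description": 380.0}


def simulate_trials(n_per_task):
    """Generate (initiation_time, task_label) pairs with Gaussian noise."""
    data = []
    for task in TASKS:
        for _ in range(n_per_task):
            data.append((random.gauss(TASK_MEANS[task], 30.0), task))
    random.shuffle(data)
    return data


def fit_centroids(train):
    """Compute the mean initiation time per task on the training split."""
    return {t: mean(x for x, lab in train if lab == t) for t in TASKS}


def predict(centroids, x):
    """Assign the task whose centroid is closest to the observed feature."""
    return min(TASKS, key=lambda t: abs(x - centroids[t]))


train = simulate_trials(100)
test = simulate_trials(50)
centroids = fit_centroids(train)
accuracy = mean(predict(centroids, x) == lab for x, lab in test)
print(f"accuracy: {accuracy:.2f} (chance = 0.33)")
```

With well-separated feature distributions, even this one-feature classifier lands well above the one-in-three chance level, mirroring the finding that initiation time alone supports above-chance task classification.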

[1]  Brian D. Ripley,et al.  Modern applied statistics with S, 4th Edition , 2002, Statistics and computing.

[2]  M. Hayhoe,et al.  In what ways do eye movements contribute to everyday activities? , 2001, Vision Research.

[3]  D. Ballard,et al.  Memory Representations in Natural Tasks , 1995, Journal of Cognitive Neuroscience.

[4]  Antonio Torralba,et al.  Contextual guidance of eye movements and attention in real-world scenes: the role of global features in object search. , 2006, Psychological review.

[5]  J. Pelz,et al.  Oculomotor behavior and perceptual strategies in complex tasks , 2001, Vision Research.

[6]  Viola S. Störmer,et al.  Feature-based interference from unattended visual field during attentional tracking in younger and older adults. , 2011, Journal of vision.

[7]  G. Altmann,et al.  Incremental interpretation at verbs: restricting the domain of subsequent reference , 1999, Cognition.

[8]  Julie C. Sedivy,et al.  Eye movements and spoken language comprehension: Effects of visual context on syntactic ambiguity resolution , 2002, Cognitive Psychology.

[9]  Marianne A. DeAngelus,et al.  Top-down control of eye movements: Yarbus revisited , 2009 .

[10]  Moreno I. Coco,et al.  The Interplay of Bottom-Up and Top-Down Mechanisms in Visual Guidance during Object Naming , 2011, Quarterly journal of experimental psychology.

[11]  Mary M Hayhoe,et al.  Task and context determine where you look. , 2016, Journal of vision.

[12]  Moreno I. Coco,et al.  Scan Patterns Predict Sentence Production in the Cross-Modal Processing of Visual Scenes , 2012, Cogn. Sci..

[14]  George L. Malcolm,et al.  The effects of target template specificity on visual search in real-world scenes: evidence from eye movements. , 2009, Journal of vision.

[15]  L. Gleitman,et al.  On the give and take between event apprehension and utterance formulation. , 2007, Journal of memory and language.

[16]  Gerry T. M. Altmann,et al.  Attentional capture of objects referred to by spoken language , 2011, Journal of Experimental Psychology: Human Perception and Performance.

[17]  Michelle R. Greene,et al.  Reconsidering Yarbus: A failure to predict observers’ task from eye movement patterns , 2012, Vision Research.

[18]  Michael L. Mack,et al.  Viewing task influences eye movement control during active scene perception. , 2009, Journal of vision.

[19]  R. Baayen,et al.  Mixed-effects modeling with crossed random effects for subjects and items , 2008 .

[20]  M F Land,et al.  The knowledge base of the oculomotor system. , 1997, Philosophical transactions of the Royal Society of London. Series B, Biological sciences.

[21]  M. Land,et al.  The Roles of Vision and Eye Movements in the Control of Activities of Daily Living , 1998, Perception.

[22]  M. Castelhano,et al.  The relative contribution of scene context and target features to visual search in scenes , 2010, Attention, perception & psychophysics.

[23]  E. Marshall,et al.  NIMH: caught in the line of fire without a general , 1995, Science.

[24]  Moreno I. Coco,et al.  The Impact of Visual Information on Reference Assignment in Sentence Production , 2009 .

[25]  Michael D. Dodd,et al.  Examining the influence of task set on eye movements and fixations. , 2011, Journal of vision.

[26]  Antonio Torralba,et al.  LabelMe: A Database and Web-Based Tool for Image Annotation , 2008, International Journal of Computer Vision.

[27]  K. Rayner,et al.  Parafoveal word processing during eye fixations in reading: Effects of word frequency , 1986, Perception & psychophysics.

[28]  J. Henderson Human gaze control during real-world scene perception , 2003, Trends in Cognitive Sciences.

[29]  D. Ballard,et al.  What you see is what you need. , 2003, Journal of vision.

[30]  K. Rayner The 35th Sir Frederick Bartlett Lecture: Eye movements and attention in reading, scene perception, and visual search , 2009, Quarterly journal of experimental psychology.

[31]  Trevor Hastie,et al.  Regularization Paths for Generalized Linear Models via Coordinate Descent. , 2010, Journal of statistical software.

[32]  Chih-Jen Lin,et al.  LIBSVM: A library for support vector machines , 2011, TIST.

[33]  J. Findlay,et al.  Rapid Detection of Person Information in a Naturalistic Scene , 2008, Perception.

[35]  Eyal M. Reingold,et al.  Direct control of fixation times in scene viewing: Evidence from analysis of the distribution of first fixation duration , 2012 .

[36]  D. Barr,et al.  Random effects structure for confirmatory hypothesis testing: Keep it maximal. , 2013, Journal of memory and language.

[37]  P. Perona,et al.  Objects predict fixations better than early saliency. , 2008, Journal of vision.

[38]  H. Ritter,et al.  Disambiguating Complex Visual Information: Towards Communication of Personal Views of a Scene , 1996, Perception.

[39]  Moreno I. Coco,et al.  Memory modulated saliency: A computational model of the incremental learning of target locations in visual search , 2013, CogSci.

[40]  D. Ballard,et al.  Modelling the role of task in the control of gaze , 2009, Visual cognition.

[41]  Zenzi M. Griffin,et al.  What the eyes say about speaking , 2000, Psychological Science.

[42]  R. Baddeley,et al.  The long and the short of it: Spatial statistics at fixation vary with saccade amplitude and task , 2006, Vision Research.

[43]  John M. Henderson,et al.  Predicting Cognitive State from Eye Movements , 2013, PloS one.

[44]  N. Hagemann,et al.  Visual perception in fencing: Do the eye movements of fencers represent their information pickup? , 2010, Attention, perception & psychophysics.

[45]  Michael F. Land,et al.  From eye movements to actions: how batsmen hit the ball , 2000, Nature Neuroscience.

[47]  A. L. Yarbus,et al.  Eye Movements and Vision , 1967, Springer US.

[48]  George L. Malcolm,et al.  Combining top-down processes to guide eye movements during real-world scene search. , 2010, Journal of vision.

[49]  Roger M. Cooper,et al.  The control of eye fixation by the meaning of spoken language: A new methodology for the real-time investigation of speech perception, memory, and language processing. , 1974 .

[50]  Julie C. Sedivy,et al.  Integration of visual and linguistic information in spoken language comprehension , 1995, Science.

[52]  D. Bates,et al.  Linear Mixed-Effects Models using 'Eigen' and S4 , 2015 .

[53]  Yuanzhen Li,et al.  Measuring visual clutter. , 2007, Journal of vision.

[54]  John J. B. Allen,et al.  Neurophysiological evidence for the influence of past experience on figure-ground perception. , 2010, Journal of vision.

[55]  R. C. Langford How People Look at Pictures, A Study of the Psychology of Perception in Art. , 1936 .

[56]  L. Stark,et al.  Scanpaths in saccadic eye movements while viewing and recognizing patterns. , 1971, Vision research.