The right look for the job: decoding cognitive processes involved in the task from spatial eye-movement patterns

The aim of this study was not only to demonstrate that eye-movement-based task decoding is possible but also to investigate whether eye-movement patterns can be used to identify the cognitive processes underlying the tasks. We compared eye-movement patterns elicited under different task conditions, with the tasks differing systematically in the types of cognitive processes involved in solving them. We used four tasks that differed along two dimensions: spatial (global vs. local) processing (Navon, Cognit Psychol 9(3):353–383, 1977) and semantic (deep vs. shallow) processing (Craik and Lockhart, J Verbal Learn Verbal Behav 11(6):671–684, 1972). We used eye-movement patterns obtained from two time periods: the fixation cross preceding the target stimulus and the target stimulus itself. We found significant effects of both spatial and semantic processing, although in the case of the latter the effect may be an artefact of insufficient task control. We found above-chance task-classification accuracy for both time periods (with four tasks, chance level is 25%): 51.4% for the period of stimulus presentation and 34.8% for the period of fixation-cross presentation. We thus show that the task can, to some extent, be decoded from preparatory eye movements made before the stimulus is displayed. This suggests that anticipatory eye movements reflect the visual scanning strategy employed for the task at hand. Finally, this study also demonstrates that decoding is possible even from very scant eye-movement data, similar to Coco and Keller (J Vis 14(3):11, 2014). This means that task decoding is not limited to tasks that naturally take longer to perform and yield multi-second eye-movement recordings.
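The decoding analysis described above can be illustrated with a minimal, self-contained sketch: a nearest-centroid classifier over two toy gaze features (mean fixation duration and mean saccade amplitude) on synthetic trials. The four task labels mirror the study's 2×2 design, but the feature choices, the numeric values, and the classifier itself are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative sketch only: decoding one of four tasks from simple
# eye-movement features with a nearest-centroid classifier.
# All feature means/SDs below are made-up numbers for the demo.
import random

random.seed(0)

TASKS = ["global-deep", "global-shallow", "local-deep", "local-shallow"]

# Assumed per-task feature means: (mean fixation duration in ms,
# mean saccade amplitude in degrees of visual angle).
MEANS = {
    "global-deep":    (260.0, 6.0),
    "global-shallow": (240.0, 5.5),
    "local-deep":     (300.0, 3.0),
    "local-shallow":  (280.0, 2.5),
}
SD = (20.0, 0.8)  # assumed trial-to-trial noise per feature

def simulate_trial(task):
    """Draw one trial's feature vector for the given task."""
    mu = MEANS[task]
    return [random.gauss(mu[0], SD[0]), random.gauss(mu[1], SD[1])]

def fit_centroids(trials):
    """Mean feature vector per task from labelled (features, task) pairs."""
    cents = {}
    for task in TASKS:
        rows = [x for x, t in trials if t == task]
        cents[task] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def predict(cents, x):
    """Assign the task with the nearest centroid, scaling each feature
    by its noise SD so both contribute comparably to the distance."""
    def dist(task):
        c = cents[task]
        return sum(((c[i] - x[i]) / SD[i]) ** 2 for i in range(2))
    return min(TASKS, key=dist)

train = [(simulate_trial(t), t) for t in TASKS for _ in range(100)]
test = [(simulate_trial(t), t) for t in TASKS for _ in range(100)]
cents = fit_centroids(train)
accuracy = sum(predict(cents, x) == t for x, t in test) / len(test)
print(f"decoding accuracy: {accuracy:.2f} (chance: 0.25)")
```

With four balanced classes, chance is 25%; the sketch merely shows how systematic feature differences between tasks translate into above-chance decoding, which is the logic the classification analysis relies on.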

[1] D. Navon. Forest before trees: The precedence of global features in visual perception. Cognitive Psychology, 1977.

[2] F. Craik, et al. Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 1972.

[3] James J. Clark, et al. An inverse Yarbus process: Predicting observers’ task from eye movement patterns. Vision Research, 2014.

[4] Michael L. Mack, et al. Viewing task influences eye movement control during active scene perception. Journal of Vision, 2009.

[5] Garrison W. Cottrell, et al. Predicting an observer's task using multi-fixation pattern analysis. ETRA, 2014.

[6] A. L. Yarbus. Eye Movements and Vision, 1967.

[7] Ilona M. Bloem, et al. Scrutinizing visual images: The role of gaze in mental imagery and memory. Cognition, 2014.

[8] John M. Henderson, et al. Predicting Cognitive State from Eye Movements. PLoS ONE, 2013.

[9] H. Intraub, et al. Levels of processing and picture memory: the physical superiority effect. Journal of Experimental Psychology: Learning, Memory, and Cognition, 1985.

[10] G. Bower, et al. Depth of processing pictures of faces and recognition memory, 1974.

[11] Jason Bell, et al. Local motion effects on form in radial frequency patterns. Journal of Vision, 2010.

[12] Linden J. Ball, et al. An Eye Movement Analysis of Web Page Usability, 2002.

[13] Marianne A. DeAngelus, et al. Top-down control of eye movements: Yarbus revisited, 2009.

[14] Tim C. Kietzmann, et al. Investigating task-dependent top-down effects on overt visual attention. Journal of Vision, 2010.

[15] N. Mackworth, et al. Cognitive determinants of fixation location during picture viewing. Journal of Experimental Psychology: Human Perception and Performance, 1978.

[16] G. Altmann. Language-mediated eye movements in the absence of a visual world: the ‘blank screen paradigm’. Cognition, 2004.

[17] B. Tatler, et al. Yarbus, eye movements, and vision. i-Perception, 2010.

[19] A. Friedman. Framing pictures: the role of knowledge in automatized encoding and memory for gist. Journal of Experimental Psychology: General, 1979.

[20] J. Henderson, et al. High-level scene perception. Annual Review of Psychology, 1999.

[21] Alan Kennedy. Book Review: Eye Tracking: A Comprehensive Guide to Methods and Measures. Quarterly Journal of Experimental Psychology, 2016.

[22] Kenneth Holmqvist, et al. Eye Tracking: A Comprehensive Guide to Methods and Measures, 2011.

[24] Michael D. Dodd, et al. Examining the influence of task set on eye movements and fixations. Journal of Vision, 2011.

[25] Fang Fang, et al. Attention modulates neuronal correlates of interhemispheric integration and global motion perception. Journal of Vision, 2014.

[26] Grigori Yourganov, et al. Observers' cognitive states modulate how visual inputs relate to gaze control. Journal of Experimental Psychology: Human Perception and Performance, 2016.

[27] Cheryl L. Grady, et al. The effect of encoding strategy on the neural correlates of memory for faces. Neuropsychologia, 2002.

[28] A. R. McIntosh, et al. Neural correlates of the episodic encoding of pictures and words. Proceedings of the National Academy of Sciences of the United States of America, 1998.

[29] J. Henderson, et al. Classifying mental states from eye movements during scene viewing. Journal of Experimental Psychology: Human Perception and Performance, 2015.

[30] Michelle R. Greene, et al. Reconsidering Yarbus: A failure to predict observers’ task from eye movement patterns. Vision Research, 2012.

[31] Krista A. Ehinger, et al. Modelling search for people in 900 scenes: A combined source model of eye guidance, 2009.

[32] Peter König, et al. Influence of Low-Level Stimulus Features, Task Dependent Factors, and Spatial Biases on Overt Visual Attention. PLoS Computational Biology, 2010.

[33] Neil D. B. Bruce, et al. Predicting task from eye movements: On the importance of spatial distribution, dynamics, and image features. Neurocomputing, 2016.

[34] L. Itti, et al. Defending Yarbus: eye movements reveal observers' task. Journal of Vision, 2014.

[35] Moreno I. Coco, et al. Classification of visual and linguistic tasks using eye-movement features. Journal of Vision, 2014.