Observers' cognitive states modulate how visual inputs relate to gaze control.

Previous research has shown that eye movements change depending on both the visual features of the environment and the viewer's top-down knowledge. An open question is the degree to which the viewer's current visual goals modulate how the visual features of a scene guide eye movements. Here, we propose a systematic framework to investigate this question. In our study, participants performed three visual tasks (search, memorization, and aesthetic judgment) on 135 scenes while their eye movements were tracked. Canonical correlation analyses showed that eye movements were reliably more related to low-level visual features at fixations during visual search than during aesthetic judgment or scene memorization. The relevance of individual visual features to eye movements also differed between tasks. This task-dependent modulation of the relationship between visual features and eye movements was further demonstrated with classification analyses, in which classifiers were trained to predict the viewing task from eye movements and visual features at fixations. Feature loadings showed that visual features at fixations could signal task differences independently of the temporal and spatial properties of eye movements. When classifying across participants, edge density and saliency at fixations were as important as eye-movement measures for successful prediction of task, with entropy and hue also significant but with smaller effect sizes. When classifying within participants, brightness and saturation were additional significant contributors. The canonical correlation and classification results, together with a test of moderation versus mediation, suggest that the cognitive state of the observer moderates the relationship between stimulus-driven visual features and eye movements.
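To make the core analysis concrete, the sketch below shows how canonical correlation analysis (CCA) relates a set of visual-feature measures at fixations to a set of eye-movement measures. This is not the authors' analysis pipeline; it is a minimal illustration on synthetic data, where `latent` stands in for a shared signal, and the `features` and `eye` matrices are hypothetical stand-ins for per-fixation feature values (e.g., saliency, edge density) and eye-movement measures (e.g., fixation duration). Canonical correlations are computed from the singular values of the whitened cross-covariance matrix.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between two column sets observed on the same rows."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n   # within-set covariance of X
    Syy = Y.T @ Y / n   # within-set covariance of Y
    Sxy = X.T @ Y / n   # cross-covariance

    def inv_sqrt(S):
        # Symmetric inverse square root via eigendecomposition (whitening).
        vals, vecs = np.linalg.eigh(S)
        return vecs @ np.diag(vals ** -0.5) @ vecs.T

    # Singular values of the whitened cross-covariance are the canonical correlations.
    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(K, compute_uv=False)

rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))  # hypothetical shared signal across both sets
# Four noisy "visual feature" columns and three noisy "eye-movement" columns,
# each driven by the same latent signal plus independent noise.
features = np.hstack([latent + 0.3 * rng.normal(size=(500, 1)) for _ in range(4)])
eye = np.hstack([latent + 0.3 * rng.normal(size=(500, 1)) for _ in range(3)])

rho = canonical_correlations(features, eye)  # sorted, largest first
```

In a task-modulation analysis of this kind, `rho` would be estimated separately per task; a reliably larger first canonical correlation in one condition (as reported here for visual search) indicates a tighter coupling between visual features and eye movements in that condition.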

[1]  Michelle R. Greene,et al.  Reconsidering Yarbus: A failure to predict observers’ task from eye movement patterns , 2012, Vision Research.

[2]  George L. Malcolm,et al.  Searching in the dark: Cognitive relevance drives attention in real-world scenes , 2009, Psychonomic bulletin & review.

[3]  J. Henderson,et al.  The effects of semantic consistency on eye movements during complex scene viewing , 1999 .

[4]  A. Kingstone,et al.  Saliency does not account for fixations to eyes within social scenes , 2009, Vision Research.

[5]  Myriam Chanceaux,et al.  The influence of clutter on real-world scene search: evidence from search efficiency and eye movements. , 2009, Journal of vision.

[6]  J. Henderson,et al.  Classifying mental states from eye movements during scene viewing. , 2015, Journal of experimental psychology. Human perception and performance.

[7]  Christof Koch,et al.  Modeling attention to salient proto-objects , 2006, Neural Networks.

[8]  Michael D. Dodd,et al.  Examining the influence of task set on eye movements and fixations. , 2011, Journal of vision.

[9]  Dirk B. Walther,et al.  Dissociation of salience-driven and content-driven spatial attention to scene category with predictive decoding of gaze patterns. , 2015, Journal of vision.

[10]  S Ullman,et al.  Shifts in selective visual attention: towards the underlying neural circuitry. , 1985, Human neurobiology.

[11]  Christof Koch,et al.  Predicting human gaze using low-level saliency combined with face detection , 2007, NIPS.

[12]  T. Paus,et al.  Neighborhood greenspace and health in a large urban center , 2015, Scientific Reports.

[13]  L. Itti,et al.  Defending Yarbus: eye movements reveal observers' task. , 2014, Journal of vision.

[14]  Ali Borji,et al.  State-of-the-Art in Visual Attention Modeling , 2013, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[15]  Nancy Millette,et al.  How People Look at Pictures , 1935 .

[16]  Moreno I. Coco,et al.  Classification of visual and linguistic tasks using eye-movement features. , 2014, Journal of vision.

[17]  H. Hotelling Relations Between Two Sets of Variates , 1936 .

[18]  Richard A. Johnson,et al.  Applied Multivariate Statistical Analysis , 1983 .

[19]  Steven G. Luke,et al.  Stable individual differences in saccadic eye movements during reading, pseudoreading, scene viewing, and scene search. , 2014, Journal of experimental psychology. Human perception and performance.

[20]  Nicola C. Anderson,et al.  It depends on when you look at it: Salience influences eye movements in natural scene viewing and search early in time. , 2015, Journal of vision.

[21]  B. Tatler,et al.  Looking and Acting: Vision and eye movements in natural behaviour , 2009 .

[22]  O. Mimura [Eye movements]. , 1992, Nippon Ganka Gakkai zasshi.

[23]  C. Koch,et al.  A saliency-based search mechanism for overt and covert shifts of visual attention , 2000, Vision Research.

[24]  Michael L. Mack,et al.  VISUAL SALIENCY DOES NOT ACCOUNT FOR EYE MOVEMENTS DURING VISUAL SEARCH IN REAL-WORLD SCENES , 2007 .

[25]  Derrick J. Parkhurst,et al.  Modeling the role of salience in the allocation of overt visual attention , 2002, Vision Research.

[26]  Antonio Torralba,et al.  Understanding and Predicting Image Memorability at a Large Scale , 2015, 2015 IEEE International Conference on Computer Vision (ICCV).

[27]  A. L. Yarbus,et al.  Eye Movements and Vision , 1967, Springer US.

[28]  Michael L. Mack,et al.  Viewing task influences eye movement control during active scene perception. , 2009, Journal of vision.

[29]  J. Henderson Regarding Scenes , 2007 .

[30]  Grigori Yourganov,et al.  The Perception of Naturalness Correlates with Low-Level Visual Features of Environmental Scenes , 2014, PloS one.

[31]  A. Treisman,et al.  A feature-integration theory of attention , 1980, Cognitive Psychology.

[32]  Christof Koch,et al.  A Model of Saliency-Based Visual Attention for Rapid Scene Analysis , 2009 .

[33]  J. Wolfe Visual search in continuous, naturalistic stimuli , 1994, Vision Research.

[34]  D. Coppola,et al.  Idiosyncratic characteristics of saccadic eye movements when viewing different visual environments , 1999, Vision Research.

[35]  Michael C. Hout,et al.  Is the preference of natural versus man-made scenes driven by bottom–up processing of the visual features of nature? , 2015, Front. Psychol..

[36]  John M. Henderson,et al.  Predicting Cognitive State from Eye Movements , 2013, PloS one.