The Role of Top-down and Bottom-up Processes in Guiding Eye Movements during Visual Search

To investigate how top-down (TD) and bottom-up (BU) information is weighted in the guidance of human search behavior, we manipulated the proportions of BU and TD components in a saliency-based model. The model is biologically plausible and implements an artificial retina and a neuronal population code. The BU component is based on feature-contrast. The TD component is defined by a feature-template match to a stored target representation. We compared the model's behavior at different mixtures of TD and BU components to the eye movement behavior of human observers performing the identical search task. We found that a purely TD model provides a much closer match to human behavior than any mixture model using BU information. Only when biological constraints are removed (e.g., eliminating the retina) did a BU/TD mixture model begin to approximate human behavior.
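The mixture model described above can be sketched as a weighted combination of two priority maps. The following is a minimal illustration, not the authors' implementation: it omits the artificial retina and the neuronal population code, approximates feature contrast with a simple center-surround (local-mean) difference, and approximates the feature-template match with normalized cross-correlation. All function names and parameters here are hypothetical.

```python
import numpy as np

def bottom_up_saliency(image, k=7):
    """Bottom-up map: local feature contrast, approximated here as the
    absolute difference between each pixel and the mean of its k x k
    neighborhood (a stand-in for center-surround contrast)."""
    pad = k // 2
    padded = np.pad(image, pad, mode="edge")
    h, w = image.shape
    local_mean = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            local_mean[i, j] = padded[i:i + k, j:j + k].mean()
    return np.abs(image - local_mean)

def top_down_saliency(image, template):
    """Top-down map: similarity of each local patch to a stored target
    template, computed as naive normalized cross-correlation.
    Assumes odd template dimensions."""
    th, tw = template.shape
    padded = np.pad(image, ((th // 2, th // 2), (tw // 2, tw // 2)),
                    mode="edge")
    t = (template - template.mean()) / (template.std() + 1e-9)
    h, w = image.shape
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + th, j:j + tw]
            p = (patch - patch.mean()) / (patch.std() + 1e-9)
            out[i, j] = (p * t).mean()
    return out

def priority_map(bu, td, w_td):
    """Weighted TD/BU mixture; w_td = 1.0 is the purely top-down model
    that the paper found best matched human behavior."""
    bu = bu / (bu.max() + 1e-9)
    td = (td - td.min()) / (td.max() - td.min() + 1e-9)
    return w_td * td + (1.0 - w_td) * bu
```

In this sketch the peak of `priority_map` would serve as the next saccade target; varying `w_td` from 0 to 1 corresponds to the different TD/BU mixtures compared against human eye movements.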
