Where to look next? Eye movements reduce local uncertainty.

How do we decide where to look next? During natural, active vision, we move our eyes to gather task-relevant information from the visual scene. Information theory provides an elegant framework for investigating how visual stimulus information combines with prior knowledge and task goals to plan an eye movement. We measured eye movements as observers performed a shape-learning and -matching task in which the task-relevant information was tightly controlled. Using computational models, we probed the underlying strategies observers use when planning their next eye movement. One strategy is to move the eyes to locations that maximize the total information gained about the shape, which is equivalent to reducing global uncertainty. Observers' behavior appears highly similar to this strategy; however, a rigorous analysis of sequential fixation placement reveals that observers may instead be using a local rule: fixate only the most informative locations, that is, reduce local uncertainty.
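To make the distinction between the two strategies concrete, the following is a minimal sketch (not the authors' model): candidate fixations are scored on a toy uncertainty map, with a global rule that maximizes total expected information gain and a local rule that simply fixates the most uncertain location. The Gaussian resolution falloff and all variable names are illustrative assumptions.

```python
# Toy contrast of the two fixation strategies described in the abstract.
# Assumptions (not from the paper): a 1-D uncertainty map and a Gaussian
# falloff of information gained with distance from the fovea.
import numpy as np

rng = np.random.default_rng(0)
entropy = rng.uniform(0.1, 1.0, size=20)  # current uncertainty (bits) at each location

def expected_reduction(fixation, entropy, sigma=2.0):
    """Expected total uncertainty resolved by fixating `fixation` (assumed model)."""
    locs = np.arange(entropy.size)
    gain = np.exp(-0.5 * ((locs - fixation) / sigma) ** 2)
    return np.sum(entropy * gain)

# Global rule: pick the fixation that maximizes total expected information gain.
global_choice = max(range(entropy.size),
                    key=lambda f: expected_reduction(f, entropy))

# Local rule: fixate the single most uncertain (most informative) location.
local_choice = int(np.argmax(entropy))

print("global-uncertainty choice:", global_choice)
print("local-uncertainty choice: ", local_choice)
```

The two rules often pick nearby locations, which is why sequential fixation data, rather than aggregate fixation maps, are needed to tell them apart.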
