Efficient saccade planning requires time and clear choices

We use eye movements constantly to gather information. Saccades are efficient when they maximize the information relevant to the task; however, there is controversy regarding the efficiency of eye movement planning. For example, saccades are efficient when searching for a single target (Nature, 434 (2005), 387-391), but are inefficient when searching for an unknown number of targets in noise, particularly under time pressure (Vision Research, 74 (2012), 61-71). In this study, we used a multiple-target search paradigm and explored whether altering the noise level or increasing saccadic latency improved efficiency. The experiments used stimuli with two levels of discriminability, such that saccades to the less discriminable stimuli provided more information. When these two noise levels corresponded to low and moderate visibility, most observers did not preferentially select informative locations, but looked at uncertain and probable target locations equally often. We then examined whether eye movements could be made more efficient by increasing the discriminability of the two stimulus levels and by delaying the first saccade so that decision processes had more time to influence the choice of saccade goal. Some observers did indeed increase the proportion of their saccades to informative locations under these conditions. Others, however, made as many saccades as they could during the limited time and were unselective about the saccade goal. A clear trend that emerges across all experiments is that conditions with a greater proportion of efficient saccades are associated with longer latencies to initiate saccades, suggesting that choosing informative locations requires deliberate planning.
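To make the notion of an "informative" saccade concrete, the following is a minimal, purely illustrative sketch (not the model used in the study). It assumes each display location carries a Bernoulli "target present" belief formed from the peripheral view, and that fixating a location yields a unit-variance Gaussian observation whose mean shift is set by a hypothetical foveal_dprime parameter. Under these assumptions, the expected entropy reduction from fixating an uncertain (hard-to-discriminate) location exceeds that from fixating a location whose status is already nearly resolved, which is the sense in which saccades to less discriminable stimuli provide more information.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (in bits) of a Bernoulli 'target present' belief p."""
    p = np.clip(np.asarray(p, dtype=float), 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def expected_info_gain(belief, foveal_dprime, n=50_000, seed=0):
    """Monte-Carlo estimate of the expected entropy reduction from fixating a
    location whose current (peripheral) target-present belief is `belief`,
    assuming the foveal observation is unit-variance Gaussian, shifted by
    `foveal_dprime` when the target is present (illustrative model only)."""
    rng = np.random.default_rng(seed)
    present = rng.random(n) < belief                        # simulate true states
    x = rng.normal(np.where(present, foveal_dprime, 0.0))   # simulated foveal samples
    lik_present = np.exp(-0.5 * (x - foveal_dprime) ** 2)   # Gaussian likelihoods
    lik_absent = np.exp(-0.5 * x ** 2)
    posterior = lik_present * belief / (lik_present * belief + lik_absent * (1 - belief))
    return float(entropy_bits(belief) - entropy_bits(posterior).mean())

# Hypothetical peripheral beliefs: a hard-to-discriminate (uncertain) location
# versus an easy one that is already almost resolved before the saccade.
for label, belief in [("uncertain location", 0.5), ("probable-target location", 0.95)]:
    gain = expected_info_gain(belief, foveal_dprime=4.0)
    print(f"{label}: expected information gain ~ {gain:.2f} bits")
```

With these assumed numbers, the uncertain location yields close to one bit per fixation, while the nearly resolved one yields only a fraction of a bit, so an information-maximizing planner would direct its next saccade to the uncertain location.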

[1]  Preeti Verghese. Active search for multiple targets is inefficient, 2010.

[2]  C. Koch, et al. A saliency-based search mechanism for overt and covert shifts of visual attention, 2000, Vision Research.

[3]  A. L. Yarbus. Eye Movements and Vision, 1967, Springer US.

[4]  Preeti Verghese, et al. Immediate feedback improves saccadic efficiency, 2013.

[5]  Miguel P. Eckstein, et al. Saccadic and perceptual performance in visual search tasks. I. Contrast detection and discrimination, 2003, Journal of the Optical Society of America A, Optics, Image Science, and Vision.

[6]  Jiri Najemnik, et al. Eye movement statistics in humans are consistent with an optimal search strategy, 2008, Journal of Vision.

[7]  Rajesh P. N. Rao, et al. Eye movements in iconic visual search, 2002, Vision Research.

[8]  Michael S. Landy, et al. Choice of saccade endpoint under risk, 2013, Journal of Vision.

[9]  A. Raftery. Bayesian Model Selection in Social Research, 1995.

[10]  Miguel P. Eckstein, et al. The time course of visual information accrual guiding eye movement decisions, 2004, Proceedings of the National Academy of Sciences of the United States of America.

[13]  Martin Rolfs, et al. Stimulus competition mediates the joint effects of spatial and feature-based attention, 2015, Journal of Vision.

[14]  Eileen Kowler, et al. Eye movements during visual search: the costs of choosing the optimal path, 2001, Vision Research.

[15]  David R. Anderson, et al. Model selection and multimodel inference: a practical information-theoretic approach, 2003.

[16]  M. Carrasco, et al. Characterizing visual performance fields: effects of transient covert attention, spatial frequency, eccentricity, task and set size, 2001, Spatial Vision.

[18]  Matthew F. Peterson, et al. Looking just below the eyes is optimal across face recognition tasks, 2012, Proceedings of the National Academy of Sciences.

[19]  Preeti Verghese, et al. Stop before you saccade: Looking into an artificial peripheral scotoma, 2015, Journal of Vision.

[20]  Pierre Baldi, et al. Bayesian surprise attracts human attention, 2005, Vision Research.

[21]  G. Legge, et al. Mr. Chips: an ideal-observer model of reading, 1997, Psychological Review.

[22]  G. Legge, et al. Mr. Chips 2002: new insights from an ideal-observer model of reading, 2002, Vision Research.

[23]  Karl R. Gegenfurtner, et al. Effects of salience and reward information during saccadic decisions under risk, 2009, Journal of the Optical Society of America A, Optics, Image Science, and Vision.

[24]  Alan C. Bovik, et al. Contrast statistics for foveated visual systems: fixation selection by minimizing contrast entropy, 2005, Journal of the Optical Society of America A, Optics, Image Science, and Vision.

[26]  Alexander C. Schütz, et al. Dynamic integration of information about salience and value for saccadic eye movements, 2012, Proceedings of the National Academy of Sciences.

[28]  Thomas M. Cover, et al. Elements of Information Theory, 2005.

[29]  M. Donk, et al. No control in orientation search: The effects of instruction on oculomotor selection in visual search, 2011, Vision Research.

[30]  Tai Sing Lee, et al. An Information-Theoretic Framework for Understanding Saccadic Eye Movements, 1999, NIPS.

[31]  Preeti Verghese, et al. Where to look next? Eye movements reduce local uncertainty, 2007, Journal of Vision.

[32]  Hang Zhang, et al. Gambling in the Visual Periphery: A Conjoint-Measurement Analysis of Human Ability to Judge Visual Uncertainty, 2010, PLoS Computational Biology.

[33]  T. Sejnowski, et al. Learning where to look for a hidden target, 2013, Proceedings of the National Academy of Sciences.

[34]  Wilson S. Geisler, et al. Optimal eye movement strategies in visual search, 2005, Nature.

[35]  G. Zelinsky. A theory of eye movements during target acquisition, 2008, Psychological Review.

[36]  Javier R. Movellan, et al. Infomax Control of Eye Movements, 2010, IEEE Transactions on Autonomous Mental Development.