Learning rational temporal eye movement strategies

Significance: In a dynamic world, humans not only have to decide where to look but also when to direct their gaze to potentially informative locations in the visual scene. Little is known about how the timing of eye movements relates to environmental regularities and how gaze strategies are learned. Here we present behavioral data establishing that humans efficiently learn to adjust the timing of their eye movements. Our computational model shows how established properties of the visual system determine the timing of gaze. Surprisingly, a Bayesian learner incorporating only the scalar law of biological timing can fully explain the course of learning these strategies. Thus, humans use temporal regularities learned from observations to adjust the scheduling of eye movements in a nearly optimal way.

During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown, despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events and is often predictive of future events, and studies of athletes suggest that the timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events, humans adopt strategies that can be understood through a computational model that includes perceptual and action uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Subjects thus traded off event detection rate against the behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model, the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient at learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.
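
The key ingredient of the learning account is the scalar law of timing: the noise in an interval estimate grows in proportion to the interval itself. The sketch below is a minimal illustration of such a learner, not the authors' implementation; the Weber fraction value, the interval grid, the flat prior, and the Gaussian noise form are all assumptions made here for demonstration.

```python
import numpy as np

# Illustrative sketch: a grid-based Bayesian learner estimating the mean
# interval between events, where each observed interval is corrupted by
# scalar timing noise (standard deviation proportional to the interval).

w = 0.15                                  # assumed Weber fraction of interval timing
grid = np.linspace(0.2, 3.0, 281)         # candidate mean event intervals (seconds)
prior = np.ones_like(grid) / grid.size    # flat prior over candidate intervals

def update(belief, observed_interval, weber=w):
    """One Bayesian update: likelihood of the noisy observation under each
    candidate interval, assuming scalar (interval-proportional) timing noise."""
    sigma = weber * grid
    likelihood = np.exp(-0.5 * ((observed_interval - grid) / sigma) ** 2) / sigma
    belief = belief * likelihood
    return belief / belief.sum()

# Simulate learning: the true interval is 1.2 s; on each trial the learner
# receives a noisy interval estimate and refines its belief about event timing.
rng = np.random.default_rng(0)
true_interval = 1.2
belief = prior
for trial in range(20):
    noisy_obs = rng.normal(true_interval, w * true_interval)
    belief = update(belief, noisy_obs)
    print(f"trial {trial + 1:2d}: posterior mean = {grid @ belief:.3f} s")
```

Under these assumptions the posterior mean converges toward the true interval over trials, and the width of the posterior shrinks at a rate set entirely by the Weber fraction, which is the sense in which the scalar law alone can determine the course of learning.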
