A delay in sampling information from temporally autocorrelated visual stimuli

Much of our world changes smoothly in time, yet the allocation of attention is typically studied with sudden changes (transients). When stimuli change gradually, there is a sizeable lag between when a cue is presented and when an object is sampled (Carlson, Hogendoorn, & Verstraten, 2006; Sheth, Nijhawan, & Shimojo, 2000). Yet this lag is not seen with rapid serial visual presentation (RSVP) stimuli, in which temporally uncorrelated stimuli are presented (Vul, Kanwisher, & Nieuwenstein, 2008; Goodbourn & Holcombe, 2015). These findings collectively suggest that temporal autocorrelation of a feature paradoxically increases the latency at which information is sampled. We tested this hypothesis by comparing stimuli that changed smoothly in time (autocorrelated) with stimuli that changed randomly. Participants attempted to report the color coincident with a visual cue. Selection lag was smaller in the randomly varying condition than in the condition with a smooth color trajectory. A third experiment showed that the increase in selection latency is due to the smoothness of the color change after the cue, rather than to extrapolated predictions based on the color changes presented before the cue. Together, these results support a theory of attentional drag, whereby attention remains engaged at a location for longer when features are changing smoothly. A computational model provides insights into the neural mechanisms that may underlie the effect.

[1] R. Oostenveld, et al. Neural Mechanisms of Visual Attention: How Top-Down Feedback Highlights Relevant Locations. 2007.

[2] H. J. Müller, et al. Spatial Cueing and the Relation between the Accuracy of "Where" and "What" Decisions in Visual Search. Quarterly Journal of Experimental Psychology A: Human Experimental Psychology, 1989.

[3] M. Nieuwenstein, et al. The attentional blink provides episodic distinctiveness: sparing at a cost. Journal of Experimental Psychology: Human Perception and Performance, 2009.

[4] S. Nishida, et al. A common perceptual temporal limit of binding synchronous inputs across different sensory attributes and modalities. Proceedings of the Royal Society B: Biological Sciences, 2010.

[5] N. Kanwisher, et al. Temporal Selection is Suppressed, Delayed, and Diffused During the Attentional Blink. Psychological Science, 2008.

[6] F. A. J. Verstraten, et al. The speed of visual attention: what time is it? Journal of Vision, 2006.

[7] R. O'Reilly, et al. Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. 2000.

[8] K. M. Schreiber, et al. The extended horopter: quantifying retinal correspondence across changes of 3D eye position. Journal of Vision, 2006.

[9] D. H. Brainard, et al. The Psychophysics Toolbox. Spatial Vision, 1997.

[10] R. VanRullen, et al. Bullet trains and steam engines: exogenous attention zips but endogenous attention chugs along. Journal of Vision, 2011.

[11] S. Nishida, et al. Marker Correspondence, Not Processing Latency, Determines Temporal Binding of Visual Attributes. Current Biology, 2002.

[12] P. Cavanagh, et al. Independent, synchronous access to color and motion features. Cognition, 2008.

[13] S. Yantis, et al. Uniqueness of abrupt visual onset in capturing attention. Perception & Psychophysics, 1988.

[14] K. Nakayama, et al. Sustained and transient components of focal visual attention. Vision Research, 1989.

[15] T. A. Carlson, et al. Timing divided attention. Attention, Perception & Psychophysics, 2010.

[16] S. Shimojo, et al. Changing objects lead briefly flashed ones. Nature Neuroscience, 2000.

[17] T. Sejnowski, et al. Representation of Color Stimuli in Awake Macaque Primary Visual Cortex. Neuron, 2003.

[18] A. O. Holcombe, et al. "Pseudoextinction": asymmetries in simultaneous attentional selection. Journal of Experimental Psychology: Human Perception and Performance, 2015.

[19] I. Motoyoshi, et al. Temporal resolution of orientation-based texture segregation. Vision Research, 2001.

[20] J. M. Zacks, et al. Event perception: a mind-brain perspective. Psychological Bulletin, 2007.

[21] J. M. Zacks, et al. Constructing Experience: Event Models from Perception to Action. Trends in Cognitive Sciences, 2017.

[22] P. Monnier, et al. Searching for variegated elements. Journal of Vision, 2011.

[23] A. Holcombe. Seeing slow and seeing fast: two limits on perception. Trends in Cognitive Sciences, 2009.