Searching for Something Familiar or Novel: Top–Down Attentional Selection of Specific Items or Object Categories

Visual search is often guided by top–down attentional templates that specify target-defining features. But search can also occur at the level of object categories. We measured the N2pc component, a marker of attentional target selection, in two visual search experiments in which targets were defined either categorically (e.g., any letter) or at the item level (e.g., the letter C) by a prime stimulus. An N2pc was elicited during category search in both experiments, whether the target category was familiar or novel (Experiment 1) and when targets were defined by symbolic primes (Experiment 2), indicating that targets are selected at early sensory-perceptual stages even when they are defined only at the category level. However, the N2pc emerged earlier and was larger during item-based search than during category-based search, demonstrating the superiority of attentional guidance by item-specific templates. We discuss the implications of these findings for attentional control and category learning.
