Five factors that guide attention in visual search

How do we find what we are looking for? Even when the desired target is in the current field of view, we need to search because fundamental limits on visual processing make it impossible to recognize everything at once. Searching involves directing attention to objects that might be the target. This deployment of attention is not random. It is guided to the most promising items and locations by five factors discussed here: bottom-up salience, top-down feature guidance, scene structure and meaning, the previous history of search over timescales ranging from milliseconds to years, and the relative value of the targets and distractors. Modern theories of visual search need to incorporate all five factors and specify how these factors combine to shape search behaviour. An understanding of the rules of guidance can be used to improve the accuracy and efficiency of socially important search tasks, from security screening to medical image perception.
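To make the idea of "combining" guidance signals concrete, here is a minimal, illustrative sketch in the spirit of priority-map models of search. It is not the authors' specification: the five maps, the weights, the grid size, and the crude inhibition-of-return step are all hypothetical placeholders, and the linear weighted sum is only one possible combination rule.

```python
# Illustrative sketch only (assumed weighted-sum combination, not the authors' model).
import numpy as np

rng = np.random.default_rng(0)
H, W = 20, 20  # hypothetical display grid

# Five hypothetical guidance maps, one value per location (higher = more promising).
salience = rng.random((H, W))  # bottom-up salience (local feature contrast)
feature  = rng.random((H, W))  # top-down match to the target's features
scene    = rng.random((H, W))  # scene structure/meaning (plausible target regions)
history  = rng.random((H, W))  # selection history (priming, contextual cueing)
value    = rng.random((H, W))  # learned value of items and locations

# Hypothetical weights; in a real model these would be estimated from behaviour.
weights = {"salience": 1.0, "feature": 2.0, "scene": 1.5, "history": 0.5, "value": 0.5}

priority = (weights["salience"] * salience
            + weights["feature"] * feature
            + weights["scene"]   * scene
            + weights["history"] * history
            + weights["value"]   * value)

# Deploy attention to the peak of the priority map, inhibit it, and repeat.
for step in range(3):
    y, x = np.unravel_index(np.argmax(priority), priority.shape)
    print(f"fixation {step + 1}: location ({y}, {x})")
    priority[y, x] = -np.inf  # crude inhibition of return
```

The point of the sketch is simply that each factor contributes a spatial map of "promise", and search behaviour falls out of how those maps are weighted and combined; the empirical work reviewed in the article constrains what those maps and weights must look like.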
