Contextual cost: When a visual-search target is not where it should be

Visual search is often facilitated when the search display occasionally repeats, an effect known as contextual cueing. According to the associative-learning account, contextual cueing arises from associating the display configuration with the target location. However, recent findings emphasizing the importance of the local context near the target raise the possibility that low-level repetition priming accounts for the contextual-cueing effect. This study distinguishes associative learning from local repetition priming by testing whether search is directed toward a target's expected location even when the target is relocated. After participants searched for a T among Ls in displays that repeated 24 times, they completed a transfer session in which the target was relocated locally to a previously blank location (Experiment 1) or to an adjacent distractor location (Experiment 2). Results revealed that contextual cueing decreased as the target appeared farther from its expected location, ultimately turning into a contextual cost when the target swapped locations with a local distractor. We conclude that target predictability is a key factor in contextual cueing.
