Pip and pop: nonspatial auditory signals improve spatial visual search.

Searching for an object within a cluttered, continuously changing environment can be a very time-consuming process. We show that a simple auditory pip drastically decreases search times for a synchronized visual object that is normally very difficult to find. This effect occurs even though the pip contains no information about the location or identity of the visual object. The experiments also show that the effect is not due to general alerting (because it does not occur with visual cues), nor is it due to top-down cuing of the visual change (because it still occurs when the pip is synchronized with distractors on the majority of trials). Instead, we propose that the temporal information of the auditory signal is integrated with the visual signal, generating a relatively salient emergent feature that automatically draws attention. Phenomenally, the synchronous pip makes the visual object pop out from its complex environment, providing a direct demonstration of spatially nonspecific sounds affecting competition in spatial visual processing.
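The qualitative contrast implied above (search times that grow with display size when the target must be found by inspection, versus roughly flat search functions when the synchronized pip makes the target pop out) can be illustrated with a toy simulation. The sketch below is not the authors' model; it assumes a simple serial, self-terminating search in which a pip-synchronized target is merely more likely to be inspected first, and all parameter values (time per item, base time, capture probability) are illustrative assumptions only.

```python
import random

def simulate_search_time(set_size, pip_present,
                         time_per_item=0.15,       # assumed cost (s) of inspecting one item
                         base_time=0.4,            # assumed non-search overhead (s)
                         capture_probability=0.9): # assumed chance the pip-boosted target is inspected first
    """Toy serial self-terminating search; a synchronized pip tends to
    move the target to the front of the inspection order."""
    positions = list(range(set_size))
    random.shuffle(positions)                # random inspection order
    target = random.choice(range(set_size))  # which item is the target

    if pip_present and random.random() < capture_probability:
        # The audiovisual synchrony makes the target salient: inspect it first.
        positions.remove(target)
        positions.insert(0, target)

    inspected = positions.index(target) + 1  # how many items were inspected before finding the target
    return base_time + inspected * time_per_item

def mean_rt(set_size, pip_present, trials=2000):
    return sum(simulate_search_time(set_size, pip_present) for _ in range(trials)) / trials

if __name__ == "__main__":
    for set_size in (12, 24, 36, 48):
        print(f"set size {set_size:2d}: "
              f"no pip {mean_rt(set_size, False):.2f} s, "
              f"pip {mean_rt(set_size, True):.2f} s")
```

Under these assumptions, mean search time in the no-pip condition increases roughly linearly with set size, whereas the pip condition stays nearly constant, mirroring the flat ("pop-out") search functions the abstract describes.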
