Efficient Visual Search from Synchronized Auditory Signals Requires Transient Audiovisual Events

Background
A prevailing view is that audiovisual integration requires temporally coincident signals. However, a recent study failed to find any evidence for audiovisual integration in visual search, even when synchronized audiovisual events were used. An important question is therefore what information is critical for audiovisual integration to be observed.

Methodology/Principal Findings
Here we demonstrate that temporal coincidence (i.e., synchrony) of auditory and visual components can trigger audiovisual interaction in cluttered displays and consequently produce very fast and efficient target identification. In visual search experiments, subjects found a modulating visual target vastly more efficiently when it was paired with a synchronous auditory signal. By manipulating the kind of temporal modulation (sine wave vs. square wave vs. difference wave; harmonic sine-wave synthesis; gradient of onset/offset ramps), we show that abrupt visual events are required for this search efficiency to occur, and that sinusoidal audiovisual modulations do not support efficient search.

Conclusions/Significance
Thus, audiovisual temporal alignment will only lead to benefits in visual search if the changes in the component signals are both synchronized and transient. We propose that transient signals are necessary in synchrony-driven binding to avoid spurious interactions with unrelated signals occurring close together in time.
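The central stimulus contrast described above, a smooth sinusoidal modulation versus an abrupt square-wave modulation applied to a synchronized audiovisual pair, can be illustrated with a minimal sketch. The Python/NumPy code below is not the authors' stimulus code; the function name and all parameters (modulation rate, carrier frequency, frame and sample rates) are illustrative assumptions rather than values from the study.

```python
# Minimal sketch of a synchronized audiovisual modulation: a visual target whose
# luminance follows a temporal envelope, paired with a tone whose amplitude
# follows the same envelope. All parameters are assumptions for illustration.
import numpy as np

DURATION_S = 2.0      # assumed trial duration (seconds)
MOD_FREQ_HZ = 1.0     # assumed temporal modulation rate of the target
VISUAL_FPS = 60       # assumed display refresh rate
AUDIO_SR = 44100      # assumed audio sample rate


def envelope(t, kind):
    """Return a 0..1 modulation envelope: smooth 'sine' or abrupt 'square'."""
    phase = 2.0 * np.pi * MOD_FREQ_HZ * t
    if kind == "sine":
        return 0.5 + 0.5 * np.sin(phase)                 # gradual changes
    if kind == "square":
        return np.where(np.sin(phase) >= 0.0, 1.0, 0.0)  # abrupt (transient) changes
    raise ValueError(f"unknown modulation kind: {kind}")


# Visual stream: one luminance value per video frame for the target element.
t_frames = np.arange(0.0, DURATION_S, 1.0 / VISUAL_FPS)
target_luminance = envelope(t_frames, "square")

# Auditory stream: a tone whose amplitude follows the *same* envelope, sampled
# on the audio clock so the auditory and visual changes remain coincident.
t_audio = np.arange(0.0, DURATION_S, 1.0 / AUDIO_SR)
tone = np.sin(2.0 * np.pi * 440.0 * t_audio)             # assumed 440 Hz carrier
synced_audio = envelope(t_audio, "square") * tone
```

Swapping "square" for "sine" produces a modulation that is equally well synchronized across the two modalities but lacks transients, which, according to the findings, is the case that does not support efficient search.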
