Sound-driven enhancement of vision: disentangling detection-level from decision-level contributions.

Cross-modal enhancement can be mediated both by higher-order effects related to attention and decision making and by detection-level, stimulus-driven interactions. However, the contribution of each of these sources to behavioral improvements has not been conclusively determined and quantified separately. Here, we applied a psychophysical analysis based on Piéron functions in order to separate stimulus-dependent changes from those accounted for by decision-level contributions. Participants performed a simple speeded visual detection task on Gabor patches of different spatial frequencies and contrast values, presented with and without accompanying sounds. On the one hand, we identified an additive cross-modal improvement in mean reaction times across all types of visual stimuli that can be well explained by interactions not strictly based on stimulus-driven modulations (e.g., due to a reduction of temporal uncertainty and of motor times). On the other hand, we singled out an audio-visual benefit that strongly depended on stimulus features such as spatial frequency and contrast. This particular enhancement was selective for low spatial frequency visual stimuli, which are optimized for magnocellular sensitivity. We therefore conclude that the interactions at detection stages and at decisional processes in response selection that contribute to audio-visual enhancement can be separated online and are expressed in partly different aspects of visual processing.
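
The logic of the Piéron-function analysis can be illustrated with a short fitting sketch. This is a minimal illustration and not the authors' analysis code: it assumes hypothetical median reaction times at five contrast levels and fits RT(C) = RT0 + k·C^(−β) separately to visual-only and audio-visual data. Under this decomposition, a sound-driven benefit that only shifts the asymptote RT0 is compatible with stimulus-independent (decision- or preparation-level) facilitation, whereas a change in the contrast-dependent term (k or β) points to a detection-level, stimulus-driven interaction.

# Minimal sketch (not the authors' code): fit Pieron's law
# RT(C) = RT0 + k * C**(-beta) to median reaction times as a function
# of stimulus contrast C, separately for visual-only and audio-visual
# conditions. Data and condition labels below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def pieron(contrast, rt0, k, beta):
    """Pieron's law: RT = RT0 + k * contrast^(-beta)."""
    return rt0 + k * np.power(contrast, -beta)

# Hypothetical median RTs (ms) at five contrast levels
contrast = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
rt_visual = np.array([520.0, 450.0, 400.0, 370.0, 355.0])
rt_audiovisual = np.array([490.0, 422.0, 373.0, 344.0, 330.0])

p0 = (300.0, 20.0, 1.0)  # starting values for RT0, k, beta
params_v, _ = curve_fit(pieron, contrast, rt_visual, p0=p0, maxfev=10000)
params_av, _ = curve_fit(pieron, contrast, rt_audiovisual, p0=p0, maxfev=10000)

rt0_v, k_v, beta_v = params_v
rt0_av, k_av, beta_av = params_av

# A pure shift of the asymptote RT0 suggests stimulus-independent facilitation;
# a change in k or beta suggests a contrast-dependent, detection-level effect.
print(f"visual-only:  RT0={rt0_v:.0f} ms, k={k_v:.2f}, beta={beta_v:.2f}")
print(f"audio-visual: RT0={rt0_av:.0f} ms, k={k_av:.2f}, beta={beta_av:.2f}")
print(f"asymptotic (additive) benefit: {rt0_v - rt0_av:.0f} ms")

In practice one would fit per participant and per spatial-frequency band and then compare the fitted parameters across conditions; the sketch only shows the shape of that comparison.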
