The informativity of sound modulates crossmodal facilitation of visual discrimination: an fMRI study

Many studies have investigated behavioral crossmodal facilitation when a visual stimulus is accompanied by a concurrent task-irrelevant sound. Lippert and colleagues reported that such a sound can reduce uncertainty about the timing of the visual display and thereby improve perceptual responses (an informative sound). However, the neural mechanism by which the informativity of sound affects crossmodal facilitation of visual discrimination has remained unclear. In this study, we used event-related functional MRI (fMRI) to investigate the neural mechanisms underlying the role of sound informativity in crossmodal facilitation of visual discrimination. Reaction times were significantly faster when there was an informative relationship between the auditory and visual stimuli. The fMRI results showed informativity-induced activation enhancement in regions including the left fusiform gyrus and the right lateral occipital complex. A further correlation analysis showed that activation in the right lateral occipital complex was significantly correlated with the behavioral benefit in reaction times. This suggests that this region was modulated by the informative relationship between the audiovisual stimuli learned during the experiment, resulting in late-stage multisensory integration and enhanced behavioral responses.
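
As a concrete illustration of the brain-behavior correlation described above, the sketch below relates each participant's reaction-time benefit to activation estimates extracted from a right lateral occipital complex region of interest. This is a minimal sketch, not the authors' actual analysis pipeline: the subject count, the array values, and the use of a Pearson correlation in Python are assumptions for illustration only; in practice the ROI estimates would come from contrast images produced by the fMRI analysis.

```python
# Minimal sketch (hypothetical data, not the study's pipeline) of correlating
# a per-subject reaction-time (RT) benefit with ROI activation estimates.

import numpy as np
from scipy import stats

# Hypothetical per-subject values (one entry per participant).
rt_uninformative = np.array([512., 498., 530., 505., 521., 490., 515., 508.])  # mean RT, ms
rt_informative   = np.array([480., 470., 502., 481., 495., 468., 486., 479.])  # mean RT, ms
loc_beta         = np.array([0.41, 0.35, 0.52, 0.44, 0.47, 0.30, 0.43, 0.39])  # right-LOC contrast estimates

# Behavioral benefit: RT speeding attributable to the informative sound.
rt_benefit = rt_uninformative - rt_informative

# Pearson correlation between ROI activation and the RT benefit.
r, p = stats.pearsonr(loc_beta, rt_benefit)
print(f"r = {r:.2f}, p = {p:.3f}")
```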

[1] Lars Muckli et al. Cortical Plasticity of Audio–Visual Object Representations, 2008, Cerebral Cortex.

[2] Wolfgang Ellermeier et al. Neural correlates of audio-visual object recognition: Effects of implicit spatial congruency, 2012, Human Brain Mapping.

[3] Koji Abe et al. Delayed audiovisual integration of patients with mild cognitive impairment and Alzheimer's disease compared with normal aged controls, 2012, Journal of Alzheimer's Disease.

[4] Lars Nyberg et al. Cortical regions underlying successful encoding of semantically congruent and incongruent associations between common auditory and visual objects, 2011, Neuroscience Letters.

[5] R. Bowtell et al. "Sparse" temporal sampling in auditory fMRI, 1999, Human Brain Mapping.

[6] J. Lewald et al. Cross-modal perceptual integration of spatially and temporally disparate auditory and visual stimuli, 2003, Cognitive Brain Research.

[7] Huamin Yang et al. Spatiotemporal Relationships among Audiovisual Stimuli Modulate Auditory Facilitation of Visual Target Discrimination, 2015, Perception.

[8] N. Bolognini et al. Enhancement of visual perception by crossmodal visuo-auditory interaction, 2002, Experimental Brain Research.

[9] H. E. Egeth et al. Response time and accuracy revisited: converging support for the interactive race model, 1993, Journal of Experimental Psychology: Human Perception and Performance.

[10] E. Bullmore et al. Response amplification in sensory-specific cortices during crossmodal binding, 1999, NeuroReport.

[11] J. Maunsell et al. Sensory modality specificity of neural activity related to memory in visual cortex, 1997, Journal of Neurophysiology.

[12] Xueting Li et al. The Role of Top-Down Task Context in Learning to Perceive Objects, 2010, The Journal of Neuroscience.

[13] Tetsuo Touge et al. Multisensory Interactions Elicited by Audiovisual Stimuli Presented Peripherally in a Visual Attention Task: A Behavioral and Event-Related Potential Study in Humans, 2009, Journal of Clinical Neurophysiology.

[14] B. Argall et al. Integration of Auditory and Visual Information about Objects in Superior Temporal Sulcus, 2004, Neuron.

[15] Eric Maris et al. Attentional Cues Affect Accuracy and Reaction Time via Different Cognitive and Neural Processes, 2012, The Journal of Neuroscience.

[16] Karl J. Friston et al. How the brain learns to see objects and faces in an impoverished context, 1997, Nature.

[17] Christoph M. Michel et al. Rapid discrimination of visual and multisensory memories revealed by electrical neuroimaging, 2004, NeuroImage.

[18] R. Campbell et al. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex, 2000, Current Biology.

[19] Tetsuo Touge et al. The temporal reliability of sound modulates visual detection: An event-related potential study, 2015, Neuroscience Letters.

[20] Michael T. Lippert et al. Improvement of visual contrast detection by a simultaneous sound, 2007, Brain Research.

[21] M. Farah et al. Role of left inferior prefrontal cortex in retrieval of semantic knowledge: a reevaluation, 1997, Proceedings of the National Academy of Sciences of the United States of America.

[22] Ryan A. Stevenson et al. Superadditive BOLD activation in superior temporal sulcus with threshold non-speech objects, 2007, Experimental Brain Research.

[23] J. Rieger et al. Audiovisual Temporal Correspondence Modulates Human Multisensory Superior Temporal Sulcus Plus Primary Sensory Cortices, 2007, The Journal of Neuroscience.