Sensory and Striatal Areas Integrate Auditory and Visual Signals into Behavioral Benefits during Motion Discrimination

For effective interactions with our dynamic environment, the brain must integrate motion information from the visual and auditory senses. Combining fMRI and psychophysics, this study investigated how the human brain integrates auditory and visual motion signals into behavioral benefits during motion discrimination. Subjects discriminated the motion direction of audiovisual stimuli that contained a directional motion signal in the auditory, visual, both, or neither modality, at two levels of signal reliability. This 2 × 2 × 2 factorial design therefore manipulated: (1) auditory motion information (signal vs noise), (2) visual motion information (signal vs noise), and (3) reliability of the motion signal (intact vs degraded). Behaviorally, subjects benefited significantly from audiovisual integration primarily for degraded auditory and visual motion signals, while obtaining near-ceiling performance for “unisensory” signals that were reliable and intact. At the neural level, we show audiovisual motion integration bilaterally in the visual motion areas hMT+/V5+ and implicate the posterior superior temporal gyrus/planum temporale in auditory motion processing. Moreover, we show that the putamen integrates audiovisual signals into more accurate motion discrimination responses. Our results suggest audiovisual integration processes at both the sensory and response selection levels. In all of these regions, the operational profile of audiovisual integration followed the principle of inverse effectiveness, in which audiovisual response suppression for intact stimuli turns into response enhancement for degraded stimuli. This response profile parallels the behavioral indices of audiovisual integration, in which subjects benefited significantly from audiovisual integration only in the degraded conditions.
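To make the inverse-effectiveness logic concrete, the sketch below quantifies the audiovisual benefit with the standard multisensory enhancement index, MEI = (AV − max(A, V)) / max(A, V), applied to discrimination accuracy. This is an illustrative example only, not the authors' analysis code, and all accuracy values are hypothetical placeholders chosen to mirror the pattern reported in the abstract (no benefit at ceiling, a clear benefit for degraded signals).

```python
# Minimal sketch (assumed, not the authors' pipeline): quantify the audiovisual
# behavioral benefit with the standard multisensory enhancement index
# MEI = (AV - max(A, V)) / max(A, V). All accuracies below are hypothetical.

def enhancement_index(av: float, a: float, v: float) -> float:
    """Relative gain of the audiovisual condition over the best unisensory condition."""
    best_unisensory = max(a, v)
    return (av - best_unisensory) / best_unisensory

# Hypothetical proportion-correct values for the informative cells of the
# 2 x 2 x 2 design, at each level of signal reliability.
conditions = {
    "intact":   {"A": 0.96, "V": 0.97, "AV": 0.97},  # near ceiling: little room for benefit
    "degraded": {"A": 0.70, "V": 0.72, "AV": 0.84},  # benefit emerges when signals are weak
}

for reliability, acc in conditions.items():
    mei = enhancement_index(acc["AV"], acc["A"], acc["V"])
    print(f"{reliability:9s}: MEI = {mei:+.2%}")
```

With these placeholder numbers the index is ~0% for intact signals and ~+17% for degraded signals, the qualitative signature of inverse effectiveness described above.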
