Oscillatory activity in auditory cortex reflects the perceptual level of audio-tactile integration

Cross-modal interactions between sensory channels have been shown to depend on both the spatial disparity and the perceptual similarity of the presented stimuli. Here we investigate the behavioral and neural integration of auditory and tactile stimulus pairs at different levels of spatial disparity. Additionally, we modulated the amplitudes of both stimuli in either a coherent or a non-coherent manner. We found that both auditory and tactile localization performance was biased towards the stimulus in the respective other modality. This bias increased linearly with stimulus disparity and was more pronounced for coherently modulated stimulus pairs. Analyses of electroencephalographic (EEG) activity at temporal–cortical sources revealed enhanced event-related potentials (ERPs) as well as decreased alpha and beta power during bimodal compared to unimodal stimulation. However, while the observed ERP differences were similar for all stimulus combinations, the extent of oscillatory desynchronization varied with stimulus disparity. Moreover, when both stimuli were subjectively perceived as originating from the same direction, the reduction in alpha and beta power was significantly stronger. These observations suggest that, in the EEG, the level of perceptual integration is reflected mainly in changes of ongoing oscillatory activity.
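The abstract contrasts bimodal against unimodal stimulation both in the ERP (trial-averaged potential) and in alpha/beta band power at source level. The following minimal Python sketch illustrates that kind of contrast on synthetic single-source epochs; it is not the authors' analysis pipeline, and the epoch arrays, sampling rate, and band edges are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): contrast bimodal vs. unimodal EEG
# epochs by (a) averaging trials to obtain ERPs and (b) estimating alpha/beta
# band power with Welch's method. All data here are synthetic placeholders.
import numpy as np
from scipy.signal import welch

fs = 500                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
# assumed shape: (n_trials, n_samples) for one temporal-cortical source
unimodal = rng.standard_normal((100, fs))
bimodal = rng.standard_normal((100, fs))

def erp(epochs):
    """Event-related potential: average across trials."""
    return epochs.mean(axis=0)

def band_power(epochs, band, fs):
    """Mean spectral power within a frequency band, averaged over trials."""
    freqs, psd = welch(epochs, fs=fs, nperseg=fs // 2, axis=-1)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean()

erp_diff = erp(bimodal) - erp(unimodal)   # bimodal-minus-unimodal ERP contrast

for name, band in {"alpha": (8, 12), "beta": (13, 30)}.items():
    p_uni = band_power(unimodal, band, fs)
    p_bi = band_power(bimodal, band, fs)
    # relative power change; negative values indicate desynchronization
    print(f"{name}: relative change = {(p_bi - p_uni) / p_uni:+.2f}")
```

On real recordings, the same contrast would be computed per condition (stimulus disparity, coherent vs. non-coherent modulation) after source projection, with statistics over participants rather than a single relative-change value.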
