Temporal Characteristics of Audiovisual Information Processing

In complex natural environments, auditory and visual information often must be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies have focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions have remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed, for each voxel, the latency at which the blood oxygenation level-dependent (BOLD) signal carried the most information about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in right Heschl's gyrus, left primary visual cortex, and the posterior portion of the superior temporal gyrus, a region known to be involved in object-related AV integration. Informative activity in the anterior portion of the superior temporal gyrus, the middle temporal gyrus, right occipital cortex, and inferior frontal cortex emerged at longer latencies. Moreover, AV presentation yielded shorter latencies in multiple cortical areas than isolated auditory or visual presentation. These results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
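
As a concrete illustration of the latency analysis described above, the sketch below estimates, for a single voxel, the post-stimulus latency at which the BOLD amplitude carries maximal mutual information about the stimulus condition (auditory, visual, or AV). This is a minimal reconstruction under simplifying assumptions, not the authors' actual pipeline: it uses a plain histogram estimator of mutual information on trial-wise amplitudes sampled at a fixed TR, and names such as `peak_information_latency` are hypothetical.

```python
# Illustrative sketch (not the published pipeline): per-voxel mutual
# information between stimulus labels and the BOLD amplitude at each
# candidate post-stimulus latency; the "informative latency" is where
# that information peaks. Binning scheme and names are assumptions.

import numpy as np

def mutual_information(labels, values, n_bins=8):
    """Histogram-based MI (in bits) between discrete labels and a 1-D signal."""
    # Discretize continuous BOLD amplitudes into roughly equally populated bins.
    edges = np.quantile(values, np.linspace(0.0, 1.0, n_bins + 1))
    binned = np.clip(np.searchsorted(edges, values, side="right") - 1,
                     0, n_bins - 1)

    # Joint distribution P(label, binned amplitude).
    classes = np.unique(labels)
    joint = np.zeros((classes.size, n_bins))
    for i, c in enumerate(classes):
        joint[i] = np.bincount(binned[labels == c], minlength=n_bins)
    joint /= joint.sum()

    px = joint.sum(axis=1, keepdims=True)   # P(label)
    py = joint.sum(axis=0, keepdims=True)   # P(binned amplitude)
    nz = joint > 0                          # avoid log(0)
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

def peak_information_latency(voxel_trials, labels, tr_s=2.0):
    """voxel_trials: (n_trials, n_timepoints) BOLD amplitudes, one row per
    trial, time-locked to stimulus onset. Returns (latency_s, mi_curve)."""
    mi_curve = np.array([
        mutual_information(labels, voxel_trials[:, t])
        for t in range(voxel_trials.shape[1])
    ])
    return float(np.argmax(mi_curve) * tr_s), mi_curve

# Toy usage: 120 trials of 3 conditions (0=A, 1=V, 2=AV), 10 timepoints each.
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=120)
trials = rng.normal(size=(120, 10))
trials[:, 3] += labels            # make the 3rd post-onset sample informative
latency, curve = peak_information_latency(trials, labels)
print(f"peak-information latency: {latency:.1f} s")
```

The toy run reports a peak near 6 s (the third post-onset sample at a 2 s TR); repeating the computation for every voxel and contrasting the A, V, and AV conditions would yield latency maps of the kind the abstract describes.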
