Neural mechanisms for the effect of prior knowledge on audiovisual integration

Converging evidence indicates that prior knowledge plays an important role in multisensory integration. However, the neural mechanisms by which prior knowledge is combined with current sensory information remain unknown. In this study, we measured event-related potentials (ERPs) while manipulating prior knowledge using a novel visual letter recognition task in which auditory information was always presented simultaneously. Each letter color was assigned a particular probability of audiovisual congruency (e.g., green = high probability (HP) and blue = low probability (LP)). Results demonstrate that this prior began affecting reaction times to the congruent audiovisual stimuli at about the 900th trial. The ERP data were therefore analyzed in two phases: the "early phase" (trial < 900) and the "late phase" (trial > 900). The effects of prior knowledge were revealed through difference waveforms generated by subtracting the ERPs for the congruent audiovisual stimuli in the LP condition from those in the HP condition. A frontal-central probability effect (90-120 ms) was observed in the early phase. A right parietal-occipital probability effect (40-96 ms) and a frontal-central probability effect (170-200 ms) were observed in the late phase. These results suggest that during the initial acquisition of knowledge about the probability of congruency, the brain allocated more attention to audiovisual stimuli in the LP condition. Once this prior knowledge had been acquired, it was used during early stages of visual processing and modulated the activity of multisensory cortical areas.
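As a minimal sketch of the analysis described above, the following Python code computes the HP-minus-LP difference waveform separately for the early and late learning phases, splitting at the 900th trial. It assumes epoched ERP data stored as a NumPy array with parallel per-trial labels; all variable names (epochs, prob, trial_idx, CH_FCZ, SFREQ) are hypothetical placeholders, not from the original study's analysis pipeline.

```python
import numpy as np

# Assumes `epochs` has shape (n_trials, n_channels, n_samples) and holds
# congruent audiovisual trials only, with parallel per-trial labels:
# `prob` in {"HP", "LP"} and `trial_idx` giving presentation order.

def difference_wave(epochs, prob, trial_idx, phase):
    """HP-minus-LP difference waveform for one learning phase.

    phase: "early" keeps trials before the 900th presentation,
           "late" keeps trials from the 900th onward.
    """
    in_phase = trial_idx < 900 if phase == "early" else trial_idx >= 900
    hp = epochs[in_phase & (prob == "HP")].mean(axis=0)  # average HP ERP
    lp = epochs[in_phase & (prob == "LP")].mean(axis=0)  # average LP ERP
    return hp - lp  # probability effect, shape (n_channels, n_samples)

# Hypothetical usage: early-phase effect at a frontal-central channel,
# averaged over the 90-120 ms window reported above.
# diff = difference_wave(epochs, prob, trial_idx, "early")
# fc_effect = diff[CH_FCZ, int(0.090 * SFREQ):int(0.120 * SFREQ)].mean()
```

The two-phase split mirrors the behavioral result that the prior only began to affect reaction times at about the 900th trial, so the early and late windows isolate the acquisition and use of the prior, respectively.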
