Seeing speech affects acoustic information processing in the human brainstem

Afferent auditory processing in the human brainstem is often assumed to be determined by acoustic stimulus features alone and to be unaffected by input from other senses or by cognitive factors. In contrast, we show that lipreading during speech perception influences early acoustic processing. Brainstem event-related potentials were recorded from ten healthy adults in response to concordant (acoustic-visual match), conflicting (acoustic-visual mismatch) and unimodal stimuli. Audiovisual (AV) interactions occurred as early as ∼11 ms after acoustic stimulation and persisted for the first 30 ms of the response. Furthermore, the magnitude of the interaction depended on the AV pairing. These findings indicate considerable plasticity in early auditory processing.
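The abstract does not specify the analysis pipeline, so the following is only a minimal sketch of one common way to quantify an AV interaction in evoked responses: an additive model, AV − (A + V), tested point by point across subjects to locate the earliest post-stimulus latency at which the interaction differs from zero. The sampling rate, analysis window, random placeholder data, and uncorrected t-test criterion are illustrative assumptions, not the authors' method.

```python
# Sketch: additive-model AV interaction in brainstem evoked responses.
# Assumed shapes: (n_subjects, n_samples) subject-averaged waveforms per condition.
import numpy as np
from scipy import stats

FS = 20000                     # assumed sampling rate (Hz); brainstem ERPs need high rates
N_SUBJECTS = 10                # ten adults, as in the abstract
N_SAMPLES = int(0.050 * FS)    # analyze the first 50 ms after sound onset (assumption)

rng = np.random.default_rng(0)

# Placeholder data standing in for epoched, artifact-rejected EEG averages (µV).
audio_only = rng.standard_normal((N_SUBJECTS, N_SAMPLES))
visual_only = rng.standard_normal((N_SUBJECTS, N_SAMPLES))
audiovisual = rng.standard_normal((N_SUBJECTS, N_SAMPLES))

# Additive-model interaction: nonzero values mean the AV response is not the
# simple sum of the unimodal responses.
interaction = audiovisual - (audio_only + visual_only)

# Point-wise one-sample t-test across subjects against zero.
t_vals, p_vals = stats.ttest_1samp(interaction, popmean=0.0, axis=0)

# Earliest latency with a reliable interaction (uncorrected alpha = 0.05,
# purely for illustration; real analyses would correct for multiple comparisons).
sig = p_vals < 0.05
if sig.any():
    first_idx = int(np.argmax(sig))
    print(f"Earliest AV interaction: {1000 * first_idx / FS:.2f} ms post-stimulus")
else:
    print("No significant AV interaction in the analysis window")
```

With real condition averages in place of the placeholder arrays, the printed latency is the quantity the abstract reports (∼11 ms); comparing interaction magnitude between concordant and conflicting conditions would address the pairing effect.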
