Modulations of ‘late’ event-related brain potentials in humans by dynamic audiovisual speech stimuli

Lipreading reliably improves speech perception during face-to-face conversation. Within the range of good dubbing, however, adults tolerate some audiovisual (AV) discrepancies, and lipreading can then give rise to confusion. We used event-related brain potentials (ERPs) to study the perceptual strategies governing the intermodal processing of dynamic, bimodal speech stimuli, dubbed either congruently or incongruently. Electrophysiological analyses revealed that incoherent audiovisual dubbings modulated the amplitude of an endogenous ERP component, the N300, which we compared to an 'N400-like effect' reflecting the difficulty of integrating these conflicting pieces of information. This result adds further support for the existence of a cerebral system underlying 'integrative processes' lato sensu. Further studies should take advantage of this 'N400-like effect' with AV speech stimuli to open new perspectives in the domain of psycholinguistics.
