Synchrony of audio–visual speech stimuli modulates left superior temporal sulcus

The superior temporal sulcus has been suggested to play a significant role in the integration of auditory and visual sensory information. Here, we presented vowels and short video clips of the corresponding articulatory gestures to healthy adult humans at two auditory–visual stimulus intervals (simultaneous and naturally asynchronous) during sparse-sampling 3-T functional magnetic resonance imaging, in order to identify brain areas sensitive to the synchrony of speech sounds and the associated articulatory gestures. The upper bank of the left middle superior temporal sulcus showed stronger activation during naturally asynchronous stimulation than during simultaneous stimulus presentation. This difference may reflect sensitivity of the left middle superior temporal sulcus to the temporal synchrony of audio–visual speech stimuli.
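
As a purely illustrative sketch, and not the analysis pipeline used in this study, the contrast underlying such a result can be pictured as an ordinary least-squares comparison of asynchronous versus simultaneous trials within a single voxel's sparse-sampled time series. All names and numbers below (the number of scans, effect sizes, noise level) are hypothetical, chosen only to show the asynchronous-minus-synchronous contrast in Python.

    import numpy as np

    # Hypothetical two-condition GLM contrast (asynchronous > simultaneous)
    # on a single voxel's sparse-sampled signal; all values are synthetic.
    rng = np.random.default_rng(0)
    n_scans = 120                              # one volume per sparse-sampling acquisition
    is_async = rng.integers(0, 2, n_scans)     # 1 = naturally asynchronous trial, 0 = simultaneous

    # Design matrix: [asynchronous regressor, simultaneous regressor, constant]
    X = np.column_stack([is_async, 1 - is_async, np.ones(n_scans)]).astype(float)

    # Synthetic voxel signal with a slightly stronger response to asynchronous trials
    y = 2.0 * is_async + 1.5 * (1 - is_async) + rng.normal(0.0, 1.0, n_scans)

    # Ordinary least-squares fit and the contrast of interest
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    contrast = np.array([1.0, -1.0, 0.0])      # asynchronous minus simultaneous
    effect = contrast @ beta

    # t statistic for the contrast
    residuals = y - X @ beta
    dof = n_scans - np.linalg.matrix_rank(X)
    sigma2 = residuals @ residuals / dof
    t_stat = effect / np.sqrt(sigma2 * contrast @ np.linalg.pinv(X.T @ X) @ contrast)
    print(f"asynchronous - simultaneous effect: {effect:.2f}, t({dof}) = {t_stat:.2f}")

In practice, whole-brain analyses fit this kind of model at every voxel and threshold the resulting contrast map; the sketch only makes the logic of the condition comparison explicit.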
