Impact of language on audiovisual speech perception examined by fMRI

Both auditory and visual information play important roles in audiovisual speech perception during face-to-face communication. Several behavioral studies have shown that native English speakers and native Japanese speakers behave differently in audiovisual speech perception. We therefore hypothesized that neural processing would also differ between the two groups. Twenty-two native English speakers and 22 native Japanese speakers watched a talker's face and listened to the talker's speech during functional magnetic resonance imaging (fMRI) scanning. The lateral occipitotemporal gyrus was associated with the visual component of audiovisual speech perception in native Japanese speakers, but not in native English speakers, suggesting that the language environment affects the neural processes underlying audiovisual speech perception.
