A sex difference in visual influence on heard speech

Reports of sex differences in language processing are inconsistent and are thought to vary with task type and difficulty. In two experiments, we investigated a sex difference in visual influence on heard speech (the McGurk effect). First, incongruent consonant-vowel stimuli were presented in which the visual portion of the signal was either brief (100 msec) or full (temporally equivalent to the auditory signal). Second, to determine whether men and women differed in their ability to extract visual speech information from the brief stimuli, the same stimuli were presented to new participants with an additional visual-only (lipreading) condition. In both experiments, women showed a significantly greater visual influence on heard speech than men did for the brief visual stimuli. No sex differences were found for the full stimuli or for the ability to lipread. These findings indicate that the more challenging brief visual stimuli elicit sex differences in the processing of audiovisual speech.
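To make the dependent measure concrete, the sketch below shows one way the proportion of visually influenced ("McGurk") responses could be tabulated per participant and compared between sexes within each visual-stimulus duration. This is a hypothetical illustration, not the authors' analysis; the column names, data, and choice of a Mann-Whitney U test are assumptions.

```python
# Minimal sketch (hypothetical, not from the paper): compute each participant's
# rate of visually influenced responses on incongruent trials, then compare
# women vs. men separately for brief and full visual stimuli.
import pandas as pd
from scipy.stats import mannwhitneyu

# Hypothetical trial-level data: one row per incongruent audiovisual trial.
trials = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "sex":         ["F", "F", "F", "F", "M", "M", "M", "M"],
    "visual":      ["brief", "full"] * 4,          # visual-stimulus duration
    "mcgurk":      [1, 1, 1, 1, 0, 1, 0, 1],       # 1 = visually influenced response
})

# Proportion of visually influenced responses per participant and condition.
rates = (trials.groupby(["participant", "sex", "visual"])["mcgurk"]
               .mean().reset_index(name="rate"))

# Compare the sexes within each visual-stimulus duration.
for visual, sub in rates.groupby("visual"):
    women = sub.loc[sub["sex"] == "F", "rate"]
    men = sub.loc[sub["sex"] == "M", "rate"]
    stat, p = mannwhitneyu(women, men, alternative="two-sided")
    print(f"{visual}: U = {stat:.1f}, p = {p:.3f}")
```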
