Benefits of facial and textual information in understanding of vocoded speech

Exposure to audiovisually presented vocoded speech is more effective than exposure to auditory-only vocoded speech in improving the subsequent ability to understand vocoded speech [1]. In addition, improvements in the audiovisual training condition were more rapid and greater in magnitude than in the auditory-only condition. The current study was conducted to establish whether exposure to concurrent textual information also improves the ability to recognize vocoded speech. Baseline measures of identification performance with auditory-only vocoded speech sentences were assessed for 45 participants. Participants then performed a speech identification task in which they were exposed to vocoded speech in an audiovisual (Group 1), auditory-only (Group 2), or auditory-only with concurrent text (Group 3) condition. Following exposure, participants were tested again on identification performance with auditory-only vocoded speech. Exposure to concurrent text improved subsequent understanding of vocoded speech, to a level similar to that seen with audiovisual speech exposure. In a second experiment, groups of normal-hearing adults were exposed to vocoded non-lexical nonsense words in auditory-only with concurrent text and audiovisual presentation conditions. Exposure to nonsense words in both the audiovisual and concurrent-text conditions improved subsequent understanding of lexical auditory-only vocoded speech, and there was no difference between the levels of improvement. In summary, computer-based exposure to audiovisual speech or to speech with concurrent text improves the ability to recognize vocoded speech relative to exposure to auditory stimuli alone. This effect does not appear to depend on exposure to lexical items.
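The stimuli in both experiments are noise-vocoded speech, the standard acoustic simulation of cochlear-implant processing used in work such as [4], [14], and [17]. As a rough illustration of how such stimuli are typically generated, the following is a minimal Python sketch of a Shannon-style noise vocoder; the channel count, band edges, and filter design are illustrative assumptions, not the parameters used in this study.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(speech, fs, n_channels=6, f_lo=100.0, f_hi=5000.0):
    """Shannon-style noise vocoder (illustrative parameters only).

    The signal is split into n_channels logarithmically spaced bands; each
    band's amplitude envelope is extracted and used to modulate band-limited
    noise, and the modulated bands are summed. Assumes fs is well above
    2 * f_hi.
    """
    edges = np.logspace(np.log10(f_lo), np.log10(f_hi), n_channels + 1)
    noise = np.random.default_rng(0).standard_normal(len(speech))
    vocoded = np.zeros(len(speech))
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, speech)      # analysis band of the speech
        envelope = np.abs(hilbert(band))     # amplitude envelope of that band
        carrier = sosfiltfilt(sos, noise)    # noise limited to the same band
        vocoded += envelope * carrier
    # Scale the output to match the RMS level of the input.
    rms_in = np.sqrt(np.mean(speech ** 2))
    rms_out = np.sqrt(np.mean(vocoded ** 2)) + 1e-12
    return vocoded * (rms_in / rms_out)
```

Fewer channels preserve less spectral detail and make the stimuli harder to understand; cochlear-implant simulation studies commonly use a small number of channels, often in the range of four to eight.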

[1] Q. Summerfield, Some preliminaries to a comprehensive account of audio-visual speech perception, 1987.

[2] N. P. Erber, Interaction of audition and vision in the recognition of oral speech stimuli, Journal of Speech and Hearing Research, 1969.

[3] R. Campbell, et al., Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex, Current Biology, 2000.

[4] M. H. Davis, et al., Lexical information drives perceptual learning of distorted speech: evidence from the comprehension of noise-vocoded sentences, Journal of Experimental Psychology: General, 2005.

[5] M. T. Cord, et al., Effects of amplification and speechreading on consonant recognition by persons with impaired hearing, Ear and Hearing, 2001.

[6] A. Macleod, et al., Quantifying the contribution of vision to speech perception in noise, British Journal of Audiology, 1987.

[7] P. C. Stacey, et al., Auditory-perceptual training using a simulation of a cochlear-implant system: a controlled study, 2005.

[8] N. P. Erber, Auditory-visual perception of speech, The Journal of Speech and Hearing Disorders, 1975.

[9] W. H. Sumby, et al., Visual contribution to speech intelligibility in noise, 1954.

[10] A. Macleod, et al., A procedure for measuring auditory and audio-visual speech-reception thresholds for sentences in noise: rationale, evaluation, and recommendations for use, British Journal of Audiology, 1990.

[11] D. Poeppel, et al., Visual speech speeds up the neural processing of auditory speech, Proceedings of the National Academy of Sciences of the United States of America, 2005.

[12] H. McGurk, et al., Hearing lips and seeing voices, Nature, 1976.

[13] R. Plomp, et al., The effect of speechreading on the speech-reception threshold of sentences in noise, The Journal of the Acoustical Society of America, 1987.

[14] F. Zeng, et al., Speech recognition with altered spectral distribution of envelope cues, The Journal of the Acoustical Society of America, 1996.

[15] P. Arnold, et al., Bisensory augmentation: a speechreading advantage when speech is clearly audible and intact, British Journal of Psychology, 2001.

[16] D. B. Pisoni, et al., Talker and lexical effects on audiovisual word recognition by adults with cochlear implants, Journal of Speech, Language, and Hearing Research, 2003.

[17] A. Faulkner, et al., Adaptation by normal listeners to upward spectral shifts of speech: implications for cochlear implants, The Journal of the Acoustical Society of America, 1999.

[18] R. Campbell, et al., Hearing by eye: the psychology of lip-reading, 1988.

[19] D. Reisberg, et al., Easy to hear but hard to understand: a lip-reading advantage with intact auditory stimuli, 1987.

[20] D. Pisoni, et al., Speech perception without traditional speech cues, Science, 1981.

[21] J. Bamford, et al., Speech-hearing tests and the spoken language of hearing-impaired children, 1979.