Measures of auditory-visual integration in nonsense syllables and sentences.
[1] W. H. Sumby, et al. Visual contribution to speech intelligibility in noise. 1954.
[2] G. A. Miller, et al. An Analysis of Perceptual Confusions Among Some English Consonants. 1955.
[3] D. Massaro. Preperceptual images, processing time, and perceptual units in auditory perception. Psychological Review, 1972.
[4] H. McGurk, et al. Hearing lips and seeing voices. Nature, 1976.
[5] B. E. Walden, et al. Some effects of training on speech recognition by hearing-impaired adults. Journal of Speech and Hearing Research, 1981.
[6] Q. Summerfield, et al. Intermodal timing relations and audio-visual speech recognition by normal-hearing adults. The Journal of the Acoustical Society of America, 1985.
[7] L. D. Braida, et al. Evaluating the articulation index for auditory-visual input. The Journal of the Acoustical Society of America, 1987.
[8] A. Boothroyd, et al. Mathematical treatment of context effects in phoneme and word recognition. The Journal of the Acoustical Society of America, 1988.
[9] B. E. Walden, et al. Visual biasing of normal and impaired auditory speech perception. Journal of Speech and Hearing Research, 1990.
[10] L. Braida. Crossmodal Integration in the Identification of Consonant Segments. The Quarterly Journal of Experimental Psychology A: Human Experimental Psychology, 1991.
[11] A. Meltzoff, et al. Integrating speech information across talkers, gender, and sensory modality: Female faces and male voices in the McGurk effect. Perception & Psychophysics, 1991.
[12] M. R. Leek, et al. Informational masking and auditory attention. Perception & Psychophysics, 1991.
[13] Q. Summerfield, et al. Lipreading and audio-visual speech perception. Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 1992.
[14] W. M. Rabinowitz, et al. Relations among different measures of speech reception in subjects using a cochlear implant. The Journal of the Acoustical Society of America, 1992.
[15] I. Winkler, et al. Event-related brain potentials reflect traces of echoic memory in humans. Perception & Psychophysics, 1993.
[16] K. W. Grant, et al. Evaluating the articulation index for auditory–visual consonant recognition. 1993.
[17] D. W. Massaro, et al. Perception of asynchronous and conflicting visual and auditory speech. The Journal of the Acoustical Society of America, 1996.
[18] L. Bernstein, et al. Generalizability of speechreading performance on nonsense syllables, words, and sentences: subjects with normal hearing. Journal of Speech and Hearing Research, 1996.
[19] C. Watson, et al. Auditory and visual speech perception: confirmation of a modality-independent source of individual differences in speech recognition. The Journal of the Acoustical Society of America, 1996.
[20] A. Baddeley, et al. The fractionation of working memory. Proceedings of the National Academy of Sciences of the United States of America, 1996.
[21] W. O. Olsen, et al. Phoneme and Word Recognition for Words in Isolation and in Sentences. Ear and Hearing, 1997.
[22] K. Grant, et al. Auditory-visual speech recognition by hearing-impaired subjects: consonant recognition, sentence recognition, and auditory-visual integration. The Journal of the Acoustical Society of America, 1998.