For speech perception by humans or machines, three senses are better than one
[1] David B. Pisoni, et al. Multi-modal encoding of speech in memory: a first report, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[2] C. Benoît, et al. Effects of phonetic context on audio-visual intelligibility of French, 1994, Journal of Speech and Hearing Research.
[3] H. McGurk, et al. Hearing lips and seeing voices, 1976, Nature.
[4] N. P. Erber, et al. Effects of sentence context on recognition of words through lipreading by deaf children, 1976, Journal of Speech and Hearing Research.
[5] A. Liberman, et al. The motor theory of speech perception revised, 1985, Cognition.
[6] M. E. Demorest, et al. Lipreading sentences with vibrotactile vocoders: performance of normal-hearing and hearing-impaired subjects, 1991, The Journal of the Acoustical Society of America.
[7] Hani Yehia, et al. Characterizing audiovisual information during speech, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[8] Charlotte M. Reed. The implications of the Tadoma method of speechreading for spoken language processing, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[9] I. Summers. Tactile Aids for the Hearing Impaired, 1992.
[10] Ruth Campbell, et al. Seeing speech in space and time: psychological and neurological findings, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[11] A. Boothroyd, et al. Speechreading supplemented by single-channel and multichannel tactile displays of voice fundamental frequency, 1995, Journal of Speech and Hearing Research.
[12] C. M. Reed, et al. Analytic study of the Tadoma method: improving performance through the use of supplementary tactual displays, 1992, Journal of Speech and Hearing Research.
[13] Q. Summerfield, et al. Intermodal timing relations and audio-visual speech recognition by normal-hearing adults, 1985, The Journal of the Acoustical Society of America.
[14] J. Utley. A test of lip reading ability, 1946, The Journal of Speech Disorders.
[15] N. P. Erber. Interaction of audition and vision in the recognition of oral speech stimuli, 1969, Journal of Speech and Hearing Research.
[16] Ali Adjoudani, et al. Audio-visual speech recognition compared across two architectures, 1995, EUROSPEECH.
[17] R. E. Remez. Critique: auditory form and gestural topology in the perception of speech, 1996, The Journal of the Acoustical Society of America.
[18] W. H. Sumby, et al. Visual contribution to speech intelligibility in noise, 1954.
[19] Robert E. Remez, et al. Perceptual organization of speech in one and several modalities: common functions, common resources, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[20] Carl E. Sherrick, et al. Basic and applied research on tactile aids for deaf people: progress and prospects, 1984.
[21] Eric David Petajan, et al. Automatic Lipreading to Enhance Speech Recognition (Speech Reading), 1984.
[22] G. A. Miller, et al. The intelligibility of speech as a function of the context of the test materials, 1951, Journal of Experimental Psychology.
[23] N. Michael Brooke, et al. Using the visual component in automatic speech recognition, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[24] D. Massaro. Speech Perception by Ear and Eye: A Paradigm for Psychological Inquiry, 1989.
[25] Kaoru Sekiyama, et al. A few factors which affect the degree of incorporating lip-read information into speech perception, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[26] C. Fowler. An event approach to the study of speech perception from a direct realist perspective, 1986.
[27] R. Diehl, et al. On the Objects of Speech Perception, 1989.
[28] J. O'Neill. Contributions of the visual components of oral symbols to speech comprehension, 1954, The Journal of Speech and Hearing Disorders.
[29] David G. Stork, et al. Speechreading by Humans and Machines, 1996.
[30] Kerry P. Green. Studies of the McGurk effect: implications for theories of speech perception, 1996, Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP '96).
[31] Eric D. Petajan. Automatic lipreading to enhance speech recognition, 1984.
[32] R. Plomp, et al. Speechreading supplemented with auditorily presented speech parameters, 1986, The Journal of the Acoustical Society of America.
[33] Q. Summerfield. Some preliminaries to a comprehensive account of audio-visual speech perception, 1987.
[34] Robert H. Gault, et al. Progress in experiments on tactual interpretation of oral speech, 1924.
[35] N. P. Erber. Effects of distance on the visual reception of speech, 1971, Journal of Speech and Hearing Research.
[36] B. Stein, et al. Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration, 1986, Journal of Neurophysiology.
[37] J. M. Weisenberger, et al. Evaluation of two multichannel tactile aids for the hearing impaired, 1989, The Journal of the Acoustical Society of America.