A comparison of models for fusion of the auditory and visual sensors in speech perception
Jean-Luc Schwartz | Pierre Escudier | Jordi Robert-Ribes
[1] G. Kramer. Auditory Scene Analysis: The Perceptual Organization of Sound by Albert Bregman (review), 2016.
[2] Jean-Luc Schwartz, et al. The prediction of vowel systems: perceptual contrast and stability, 1995.
[3] C. Benoît, et al. Effects of phonetic context on audio-visual intelligibility of French, 1994, Journal of Speech and Hearing Research.
[4] R. Lynn, et al. A secular decline in the strength of Spearman's g in Japan, 1994.
[5] M. Radeau. Auditory-visual spatial interaction and modularity, 1994, Current Psychology of Cognition / Cahiers de Psychologie Cognitive (CPC).
[6] Dominic W. Massaro, et al. Perceiving asynchronous bimodal speech in consonant-vowel and vowel syllables, 1993, Speech Communication.
[7] Antoinette T. Gesi, et al. Bimodal speech perception: an examination across languages, 1993.
[8] Y. Tohkura, et al. Inter-language differences in the influence of visual cues in speech perception, 1993.
[9] Jean-Luc Schwartz, et al. Integrating auditory and visual representations for audiovisual vowel recognition, 1993, EUROSPEECH.
[10] Yves Demazeau, et al. Principles and techniques for sensor data fusion, 1993, Signal Processing.
[11] L. Lisker, et al. Auditory and Visual Cueing of the [±Rounded] Feature of Vowels, 1992, Language and Speech.
[12] J. Vroomen, et al. Abstract versus modality-specific memory representations in processing auditory and visual speech, 1992, Memory & Cognition.
[13] Gregory J. Wolff, et al. Neural network lipreading system for improved speech recognition, 1992, Proceedings of the 1992 IJCNN International Joint Conference on Neural Networks.
[14] Martin Beckerman, et al. A Bayes-maximum entropy method for multi-sensor data fusion, 1992, Proceedings of the 1992 IEEE International Conference on Robotics and Automation.
[15] Helen C. Shen, et al. Sensory data integration: a team consensus approach, 1992, Proceedings of the 1992 IEEE International Conference on Robotics and Automation.
[16] Q. Summerfield, et al. Lipreading and audio-visual speech perception, 1992, Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences.
[17] C. Fowler, et al. Listening with eye and hand: cross-modal contributions to speech perception, 1991, Journal of Experimental Psychology: Human Perception and Performance.
[18] M. H. Goldstein, et al. Comparing human and neural network lip readers, 1991, The Journal of the Acoustical Society of America.
[19] R. Hari, et al. Seeing speech: visual information from lip movements modifies activity in the human auditory cortex, 1991, Neuroscience Letters.
[20] P. Kuhl, et al. Integral processing of visual place and auditory voicing information during phonetic perception, 1991, Journal of Experimental Psychology: Human Perception and Performance.
[21] Tomio Watanabe, et al. Lip-reading of Japanese vowels using neural networks, 1990, ICSLP.
[22] Terrence J. Sejnowski, et al. Neural network models of sensory integration for improved vowel recognition, 1990, Proceedings of the IEEE.
[23] S. Arlinger, et al. Visual evoked potentials: relation to adult speechreading and cognitive function, 1989, Journal of Speech and Hearing Research.
[24] R. Welch. A comparison of speech perception and spatial localization, 1989.
[25] D. Massaro. Speech Perception by Ear and Eye: A Paradigm for Psychological Inquiry, 1989.
[26] B. P. Yuhas, et al. Integration of acoustic and visual speech signals using neural networks, 1989, IEEE Communications Magazine.
[27] D. Massaro. Testing between the TRACE model and the fuzzy logical model of speech perception, 1989, Cognitive Psychology.
[28] Allen A. Montgomery, et al. Automatic optically-based recognition of speech, 1988, Pattern Recognition Letters.
[29] K. Green. The perception of speaking rate using visual information from a talker's face, 1987, Perception & Psychophysics.
[30] M. Konishi. Centrally synthesized maps of sensory space, 1986, Trends in Neurosciences.
[31] Louis D. Braida, et al. Use of articulatory signals in automatic speech recognition, 1986.
[32] Olivier D. Faugeras, et al. Building visual maps by combining noisy stereo measurements, 1986, Proceedings of the 1986 IEEE International Conference on Robotics and Automation.
[33] Christian Abry, et al. "Laws" for lips, 1986, Speech Communication.
[34] Kenji Kurosu, et al. Speech Recognition by Image Processing of Lip Movements, 1986.
[35] R. Plomp, et al. Speechreading supplemented with auditorily presented speech parameters, 1986, The Journal of the Acoustical Society of America.
[36] A. Liberman, et al. The motor theory of speech perception revised, 1985, Cognition.
[37] J. L. Miller, et al. On the role of visual rate information in phonetic perception, 1985, Perception & Psychophysics.
[38] P. K. Kuhl, et al. The contribution of fundamental frequency, amplitude envelope, and voicing duration cues to speechreading in normal-hearing subjects, 1985, The Journal of the Acoustical Society of America.
[39] B. Delgutte, et al. Speech coding in the auditory nerve: IV. Sounds with consonant-like dynamic characteristics, 1984, The Journal of the Acoustical Society of America.
[40] Q. Summerfield, et al. Detection and Resolution of Audio-Visual Incompatibility in the Perception of Vowels, 1984, The Quarterly Journal of Experimental Psychology A: Human Experimental Psychology.
[41] B. Stein, et al. Interactions among converging sensory inputs in the superior colliculus, 1983, Science.
[42] C. M. Reed, et al. Research on the Tadoma method of speech communication, 1983, The Journal of the Acoustical Society of America.
[43] A. Meltzoff, et al. The bimodal perception of speech in infancy, 1982, Science.
[44] E. Knudsen. Auditory and visual maps of space in the optic tectum of the owl, 1982, The Journal of Neuroscience.
[45] Q. Summerfield, et al. Audiovisual presentation demonstrates that selective adaptation in speech perception is purely auditory, 1981, Perception & Psychophysics.
[46] Brian C. J. Moore, et al. Voice pitch as an aid to lipreading, 1981, Nature.
[47] B. Lindblom, et al. Modeling the judgment of vowel quality differences, 1981, The Journal of the Acoustical Society of America.
[48] N. F. Dixon, et al. The Detection of Auditory Visual Desynchrony, 1980, Perception.
[49] R. Campbell, et al. Hearing by Eye, 1980, The Quarterly Journal of Experimental Psychology.
[50] Barbara Dodd. Lip reading in infants: Attention to speech presented in- and out-of-synchrony, 1979, Cognitive Psychology.
[51] Q. Summerfield. Use of Visual Information for Phonetic Perception, 1979, Phonetica.
[52] Dennis H. Klatt. Speech perception: a model of acoustic-phonetic analysis and lexical access, 1979.
[53] D. C. Shepherd, et al. Visual-neural correlate of speechreading ability in normal-hearing adults, 1977, Journal of Speech and Hearing Research.
[54] Dominic W. Massaro, et al. Dividing attention between auditory and visual perception, 1977.
[55] H. McGurk, et al. Hearing lips and seeing voices, 1976, Nature.
[56] N. P. Erber. Auditory-visual perception of speech, 1975, The Journal of Speech and Hearing Disorders.
[57] A. A. Montgomery, et al. Auditory and visual contributions to the perception of consonants, 1974, Journal of Speech and Hearing Research.
[58] N. P. Erber. Interaction of audition and vision in the recognition of oral speech stimuli, 1969, Journal of Speech and Hearing Research.
[59] W. H. Sumby, et al. Erratum: Visual Contribution to Speech Intelligibility in Noise [J. Acoust. Soc. Am. 26, 212 (1954)], 1954.
[60] W. H. Sumby, et al. Visual contribution to speech intelligibility in noise, 1954.
[61] A. J. King, et al. Integration of visual and auditory information in bimodal neurones in the guinea-pig superior colliculus, 1985, Experimental Brain Research.
[62] J. Vroomen. Hearing Voices and Seeing Lips: Investigations in the Psychology of Lipreading, 1992.
[63] A. Morris. Analyse informationnelle du traitement de la parole dans le système auditif périphérique et le noyau cochléaire : application à la reconnaissance des occlusives voisées du français [Information-theoretic analysis of speech processing in the peripheral auditory system and the cochlear nucleus: application to the recognition of French voiced stops], 1992.
[64] Mohamed Tahar Lallouache. Un poste "visage-parole" couleur : acquisition et traitement automatique des contours des lèvres [A color "face-speech" workstation: acquisition and automatic processing of lip contours], 1991.
[65] J. Vroomen, et al. Face recognition and lip-reading in autism, 1991.
[66] Jean Piaget, et al. Traité de psychologie expérimentale [Treatise on experimental psychology], 1991.
[67] P. K. Kuhl, et al. The role of visual information in the processing of place and manner features in speech perception, 1989, Perception & Psychophysics.
[68] Ruth Campbell. Tracing Lip Movements: Making Speech Visible, 1988.
[69] A. Macleod, et al. Quantifying the contribution of vision to speech perception in noise, 1987, British Journal of Audiology.
[70] Q. Summerfield. Some preliminaries to a comprehensive account of audio-visual speech perception, 1987.
[71] D. Reisberg, et al. Easy to hear but hard to understand: A lip-reading advantage with intact auditory stimuli, 1987.
[72] C. Fowler. An event approach to the study of speech perception from a direct realist perspective, 1986.
[73] Eric David Petajan. Automatic Lipreading to Enhance Speech Recognition (Speech Reading), 1984.
[74] Björn Lindblom. Frontiers of speech communication research, 1979.
[75] H. W. Campbell. Phoneme recognition by ear and by eye: a distinctive feature analysis, 1974.