Influences of selective adaptation on perception of audiovisual speech
[1] H. McGurk, et al. Hearing lips and seeing voices, 1976, Nature.
[2] John F. Magnotti, et al. Variability and stability in the McGurk effect: contributions of participants, stimuli, time, and response type, 2015, Psychonomic Bulletin & Review.
[3] P. Arnold, et al. Bisensory augmentation: a speechreading advantage when speech is clearly audible and intact, 2001, British Journal of Psychology.
[4] P. D. Eimas, et al. Selective adaptation of linguistic feature detectors, 1973.
[5] Jennifer S. Pardo, et al. On phonetic convergence during conversational interaction, 2006, The Journal of the Acoustical Society of America.
[6] Timothy R. Jordan, et al. Determining the influence of Gaussian blurring on inversion effects with talking faces, 2002, Perception & Psychophysics.
[7] John J. Foxe, et al. Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments, 2006, Cerebral Cortex.
[8] M. Sams, et al. Primary auditory cortex activation by visual speech: an fMRI study at 3 T, 2005, NeuroReport.
[9] D. Massaro. Speech Perception by Ear and Eye: A Paradigm for Psychological Inquiry, 1989.
[10] A. G. Samuel, et al. More adaptation of speech by nonspeech, 1984, Journal of Experimental Psychology: Human Perception and Performance.
[11] S. Goldinger. Echoes of echoes? An episodic theory of lexical access, 1998, Psychological Review.
[12] David B. Pisoni, et al. Multimodal perceptual organization of speech: Evidence from tone analogs of spoken utterances, 1998, Speech Communication.
[13] David J. Ostry, et al. Somatosensory function in speech perception, 2009, Proceedings of the National Academy of Sciences.
[14] Q. Summerfield. Some preliminaries to a comprehensive account of audio-visual speech perception, 1987.
[15] N. P. Erber. Auditory-visual perception of speech, 1975, The Journal of Speech and Hearing Disorders.
[16] Ruben van de Vijver. Review of D. Pisoni & R. Remez (eds.), The Handbook of Speech Perception (Oxford: Blackwell, 2005), 2009.
[17] J. L. Miller, et al. On the role of visual rate information in phonetic perception, 1985, Perception & Psychophysics.
[18] Jean Vroomen, et al. Phonetic recalibration in audiovisual speech, 2012.
[19] A. Samuel. Lexical Activation Produces Potent Phonemic Percepts, 1997, Cognitive Psychology.
[20] L. Bernstein, et al. Audiovisual Speech Binding: Convergence or Association?, 2004.
[21] A. G. Samuel, et al. Knowing a Word Affects the Fundamental Perception of the Sounds Within It, 2001, Psychological Science.
[22] Peter D. Eimas, et al. Some properties of linguistic feature detectors, 1973.
[23] W. Ganong. The selective adaptation effects of burst-cued stops, 1978, Perception & Psychophysics.
[24] Lawrence D. Rosenblum, et al. Primacy of Multimodal Speech Perception, 2008.
[25] Mikko Sams, et al. Seeing speech affects acoustic information processing in the human brainstem, 2005, Experimental Brain Research.
[26] R. Remez, et al. Perceptual Organization of Speech, 2008, The Handbook of Speech Perception.
[27] Ruth Campbell, et al. The processing of audio-visual speech: empirical and neural bases, 2008, Philosophical Transactions of the Royal Society B: Biological Sciences.
[28] L. D. Rosenblum, et al. Selective adaptation in speech perception using a compelling audiovisual adaptor, 1994, The Journal of the Acoustical Society of America.
[29] Q. Summerfield, et al. Audiovisual presentation demonstrates that selective adaptation in speech perception is purely auditory, 1981, Perception & Psychophysics.
[30] Lawrence D. Rosenblum, et al. Alignment to visual speech information, 2010, Attention, Perception & Psychophysics.
[31] E. Vatikiotis-Bateson, et al. Eye movement of perceivers during audiovisual speech perception, 1998, Perception & Psychophysics.
[32] D. E. Callan, et al. Multimodal contribution to speech perception revealed by independent component analysis: a single-sweep EEG case study, 2001, Brain Research: Cognitive Brain Research.
[33] Jean Vroomen, et al. Do you see what you are hearing? Cross-modal effects of speech sounds on lipreading, 2010, Neuroscience Letters.
[34] C. Fowler, et al. Listening with eye and hand: Cross-modal contributions to speech perception, 1991.
[35] R. Hari, et al. Seeing speech: visual information from lip movements modifies activity in the human auditory cortex, 1991, Neuroscience Letters.
[36] A. Little, et al. Adaptation to different mouth shapes influences visual perception of ambiguous lip speech, 2010, Psychonomic Bulletin & Review.
[37] Mikko Sams, et al. Seeing and hearing others and oneself talk, 2005, Brain Research: Cognitive Brain Research.
[38] B. Gick, et al. Aero-tactile integration in speech perception, 2009, Nature.
[39] D. Reisberg, et al. Easy to hear but hard to understand: A lip-reading advantage with intact auditory stimuli, 1987.
[40] Lawrence D. Rosenblum, et al. Speech Perception as a Multimodal Phenomenon, 2008, Current Directions in Psychological Science.
[41] H. McGurk, et al. Visual influences on speech perception processes, 1978, Perception & Psychophysics.
[42] W. H. Sumby, et al. Visual contribution to speech intelligibility in noise, 1954.
[43] L. Rosenblum, et al. Discrimination tests of visually influenced syllables, 1992, Perception & Psychophysics.
[44] P. Deltenre, et al. Mismatch negativity evoked by the McGurk–MacDonald effect: a phonetic representation within short-term memory, 2002, Clinical Neurophysiology.
[45] Arthur G. Samuel, et al. Visual speech acts differently than lexical context in supporting speech perception, 2014, Journal of Experimental Psychology: Human Perception and Performance.
[46] E. Bullmore, et al. Activation of auditory cortex during silent lipreading, 1997, Science.
[47] R. Campbell, et al. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex, 2000, Current Biology.
[48] K. Green, et al. Perception of /r/ and /l/ in a stop cluster: evidence of cross-modal context effects, 2001, Journal of Experimental Psychology: Human Perception and Performance.
[49] Antoine J. Shahin, et al. Tolerance for audiovisual asynchrony is enhanced by the spectrotemporal fidelity of the speaker's mouth movements and speech, 2017, Language, Cognition and Neuroscience.