Perception of audiovisual speech synchrony for native and non-native language
Charles Spence | Salvador Soto-Faraco | Agnès Alsius | Jordi Navarra | Ignacio Velasco