Effect of attentional load on audiovisual speech perception: evidence from ERPs
[1] K. Tiippana. What is the McGurk effect? , 2014, Front. Psychol.
[2] M. Murray,et al. Multisensory Integration: Flexible Use of General Operations , 2014, Neuron.
[3] J. Vroomen,et al. Electrophysiological evidence for speech-specific audiovisual integration , 2014, Neuropsychologia.
[4] A. Karmiloff-Smith,et al. Audio-visual speech perception: a developmental ERP investigation , 2013, Developmental science.
[5] S. Soto-Faraco,et al. Neural correlates of audiovisual speech processing in a second language , 2013, Brain and Language.
[6] S. Soto-Faraco,et al. Visual information constrains early and late stages of spoken-word recognition in sentence context. , 2013, International journal of psychophysiology : official journal of the International Organization of Psychophysiology.
[7] Frédéric Berthommier,et al. Binding and unbinding the auditory and visual streams in the McGurk effect. , 2012, The Journal of the Acoustical Society of America.
[8] K. Munhall,et al. The Influence of Selective Attention to Auditory and Visual Speech on the Integration of Audiovisual Speech Information , 2011, Perception.
[9] Luc H. Arnal,et al. Transitions in neural oscillations reflect prediction errors generated in audiovisual speech , 2011, Nature Neuroscience.
[10] Salvador Soto-Faraco,et al. Searching for audiovisual correspondence in multiple speaker scenarios , 2011, Experimental Brain Research.
[11] D. Senkowski,et al. The multifaceted interplay between attention and multisensory integration , 2010, Trends in Cognitive Sciences.
[12] Michael Pilling. Auditory event-related potentials (ERPs) in audiovisual speech perception. , 2009, Journal of speech, language, and hearing research : JSLHR.
[13] D. Lewkowicz,et al. Narrowing of intersensory speech perception in infancy , 2009, Proceedings of the National Academy of Sciences.
[14] Lynne E. Bernstein,et al. Mismatch Negativity with Visual-only and Audiovisual Speech , 2009, Brain Topography.
[15] S. Soto-Faraco,et al. Deconstructing the McGurk-MacDonald illusion. , 2009, Journal of experimental psychology. Human perception and performance.
[16] Emiliano Macaluso,et al. Spatial attention can modulate audiovisual integration at multiple cortical and subcortical sites , 2009, The European journal of neuroscience.
[17] Mikko Sams,et al. The role of visual spatial attention in audiovisual speech perception , 2009, Speech Commun.
[18] Sidney S. Simon,et al. Merging of the Senses , 2008, Front. Neurosci.
[19] J. Driver,et al. Multisensory Interplay Reveals Crossmodal Influences on ‘Sensory-Specific’ Brain Regions, Neural Responses, and Judgments , 2008, Neuron.
[20] Lynne E. Bernstein,et al. Spatiotemporal dynamics of audiovisual speech processing , 2008, NeuroImage.
[21] Jean Vroomen,et al. Neural Correlates of Multisensory Integration of Ecologically Valid Audiovisual Events , 2007, Journal of Cognitive Neuroscience.
[22] Salvador Soto-Faraco,et al. Attention to touch weakens audiovisual speech integration , 2007, Experimental Brain Research.
[23] Salvador Soto-Faraco,et al. Conscious access to the unisensory components of a cross-modal illusion , 2007, Neuroreport.
[24] John J. Foxe,et al. Seeing voices: High-density electrical mapping and source-analysis of the multisensory mismatch negativity evoked during the McGurk illusion , 2007, Neuropsychologia.
[25] M. Woldorff,et al. Selective attention and audiovisual integration: is attending to both modalities a prerequisite for early integration? , 2006, Cerebral cortex.
[26] Shin'ya Nishida,et al. Visual search for a target changing in synchrony with an auditory signal , 2006, Proceedings of the Royal Society B: Biological Sciences.
[28] Daniel Senkowski,et al. Multisensory processing and oscillatory gamma responses: effects of spatial selective attention , 2005, Experimental Brain Research.
[29] Marty G. Woldorff,et al. Selective Attention and Multisensory Integration: Multiple Phases of Effects on the Evoked Brain Activity , 2005, Journal of Cognitive Neuroscience.
[30] R. Campbell,et al. Audiovisual Integration of Speech Falters under High Attention Demands , 2005, Current Biology.
[31] Karl J. Friston,et al. A theory of cortical responses , 2005, Philosophical Transactions of the Royal Society B: Biological Sciences.
[32] David Poeppel,et al. Visual speech speeds up the neural processing of auditory speech. , 2005, Proceedings of the National Academy of Sciences of the United States of America.
[33] Denis Burnham,et al. Auditory-visual speech integration by prelinguistic infants: perception of an emergent consonant in the McGurk effect. , 2004, Developmental psychobiology.
[34] A. Fort,et al. Bimodal speech: early suppressive visual effects in human auditory cortex , 2004, The European journal of neuroscience.
[35] J. Navarra,et al. Assessing automaticity in audiovisual speech integration: evidence from the speeded classification task , 2004, Cognition.
[36] M. Sams,et al. Time course of multisensory interactions during audiovisual speech perception in humans: a magnetoencephalographic study , 2004, Neuroscience Letters.
[37] Tobias S. Andersen,et al. Visual attention modulates audiovisual speech perception , 2004 .
[38] Ryusuke Kakigi,et al. Interaction between auditory and visual stimulus relating to the vowel sounds in the auditory cortex in humans: a magnetoencephalographic study , 2004, Neuroscience Letters.
[39] M. Sams,et al. Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception. , 2003, Brain research. Cognitive brain research.
[40] N. Logothetis,et al. Neuroperception: Facial expressions linked to monkey calls , 2003, Nature.
[41] John J. Foxe,et al. Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study. , 2002, Brain research. Cognitive brain research.
[42] S A Hillyard,et al. An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings. , 2002, Brain research. Cognitive brain research.
[43] Mikko Sams,et al. Processing of changes in visual speech in the human auditory cortex. , 2002, Brain research. Cognitive brain research.
[44] P. Deltenre,et al. Mismatch negativity evoked by the McGurk–MacDonald effect: a phonetic representation within short-term memory , 2002, Clinical Neurophysiology.
[45] C. Frith,et al. Shifting baselines in attention research , 2000, Nature Reviews Neuroscience.
[46] C. Spence,et al. Multisensory perception: Beyond modularity and convergence , 2000, Current Biology.
[47] C. Frith,et al. Modulation of human visual cortex by crossmodal spatial attention. , 2000, Science.
[48] R. Campbell,et al. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex , 2000, Current Biology.
[49] M. Giard,et al. Auditory-Visual Integration during Multimodal Object Recognition in Humans: A Behavioral and Electrophysiological Study , 1999, Journal of Cognitive Neuroscience.
[50] E. Bullmore,et al. Response amplification in sensory-specific cortices during crossmodal binding. , 1999, Neuroreport.
[51] G. Plant. Perceiving Talking Faces: From Speech Perception to a Behavioral Principle , 1999 .
[52] L. Rosenblum,et al. An audiovisual test of kinematic primitives for visual speech perception. , 1996, Journal of experimental psychology. Human perception and performance.
[53] R. Hari,et al. Seeing speech: visual information from lip movements modifies activity in the human auditory cortex , 1991, Neuroscience Letters.
[54] D. Guthrie,et al. Significance testing of difference potentials. , 1991, Psychophysiology.
[55] M. Scherg,et al. Evoked dipole source potentials of the human auditory cortex. , 1986, Electroencephalography and clinical neurophysiology.
[56] A. Meltzoff,et al. The bimodal perception of speech in infancy. , 1982, Science.
[57] R. Näätänen. Processing negativity: an evoked-potential reflection of selective attention. , 1982, Psychological bulletin.
[58] J. G. Snodgrass,et al. A standardized set of 260 pictures: norms for name agreement, image agreement, familiarity, and visual complexity. , 1980, Journal of experimental psychology. Human learning and memory.
[59] H. McGurk,et al. Visual influences on speech perception processes , 1978, Perception & psychophysics.
[61] H. McGurk,et al. Hearing lips and seeing voices , 1976, Nature.
[62] S. Hillyard,et al. Electrical Signs of Selective Attention in the Human Brain , 1973, Science.
[63] W. Ritter,et al. The sources of auditory evoked responses recorded from the human scalp. , 1970, Electroencephalography and clinical neurophysiology.
[64] D. Lewkowicz,et al. The development of audiovisual speech perception , 2012 .
[65] Kevin G Munhall,et al. The effect of a concurrent working memory task and temporal offsets on the integration of auditory and visual speech information. , 2012, Seeing and perceiving.
[67] Mikko Sams,et al. Sound location can influence audiovisual speech perception when spatial attention is manipulated. , 2011, Seeing and perceiving.
[70] S. Hillyard,et al. Human auditory evoked potentials. I. Evaluation of components. , 1974, Electroencephalography and clinical neurophysiology.