Turning a blind eye to the lexicon: ERPs show no cross-talk between lip-read and lexical context during speech sound processing

[1] Martijn Baart, et al. Early processing of auditory lexical predictions revealed by ERPs, 2015, Neuroscience Letters.

[2] J. Schwartz, et al. A possible neurophysiological correlate of audiovisual binding and unbinding in speech perception, 2014, Front. Psychol.

[3] M. Sams, et al. Effect of attentional load on audiovisual speech perception: evidence from ERPs, 2014, Front. Psychol.

[4] Marc Sato, et al. The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception, 2014, Front. Psychol.

[5] Arthur G. Samuel, et al. Visual speech acts differently than lexical context in supporting speech perception, 2014, Journal of Experimental Psychology: Human Perception and Performance.

[6] J. Vroomen, et al. Electrophysiological evidence for speech-specific audiovisual integration, 2014, Neuropsychologia.

[7] Saul Sternberg, et al. The meaning of additive reaction-time effects: some misconceptions, 2013, Front. Psychol.

[8] L. Granjon, et al. Seeing the initial articulatory gestures of a word triggers lexical access, 2013.

[9] S. Soto-Faraco, et al. Visual information constrains early and late stages of spoken-word recognition in sentence context, 2013, International Journal of Psychophysiology.

[10] A. Duchon, et al. EsPal: One-stop shopping for Spanish word properties, 2013, Behavior Research Methods.

[11] Axel H. Winneke, et al. ERP evidence that auditory-visual speech facilitates working memory in younger and older adults, 2013, Psychology and Aging.

[12] J. Vroomen, et al. Electrophysiological correlates of predictive coding of auditory location in the perception of natural audiovisual events, 2012, Front. Integr. Neurosci.

[13] J. Vroomen, et al. Lipread-induced phonetic recalibration in dyslexia, 2012, Acta Psychologica.

[14] Friedemann Pulvermüller, et al. Ultra-rapid access to words in the brain, 2012, Nature Communications.

[15] Axel H. Winneke, et al. Does audiovisual speech offer a fountain of youth for old ears? An event-related brain potential study of age differences in audiovisual speech perception, 2011, Psychology and Aging.

[16] Jean Vroomen, et al. Visual Anticipatory Information Modulates Multisensory Interactions of Artificial Audiovisual Stimuli, 2010, Journal of Cognitive Neuroscience.

[17] J. Vroomen, et al. Phonetic recalibration does not depend on working memory, 2010, Experimental Brain Research.

[18] Jean Vroomen, et al. Recalibration of Phonetic Categories by Lipread Speech: Measuring Aftereffects After a 24-hour Delay, 2009, Language and Speech.

[19] Jean Vroomen, et al. Phonetic recalibration only occurs in speech mode, 2009, Cognition.

[20] A. Samuel, et al. Accommodating variation: Dialects, idiolects, and speech processing, 2008, Cognition.

[21] P. Hagoort. The fractionation of spoken language understanding by measuring electrical and magnetic brain signals, 2008, Philosophical Transactions of the Royal Society B: Biological Sciences.

[22] J. MacDonald, et al. Hearing Lips and Seeing Voices: Illusion and Serendipity in Auditory-Visual Perception Research, 2008.

[23] P. Kiely, et al. When /b/ill with /g/ill becomes /d/ill: Evidence for a lexical effect in audiovisual speech perception, 2008.

[24] P. Bertelson, et al. Visual recalibration and selective adaptation in auditory–visual speech perception: Contrasting build-up courses, 2007, Neuropsychologia.

[25] Jean Vroomen, et al. Neural Correlates of Multisensory Integration of Ecologically Valid Audiovisual Events, 2007, Journal of Cognitive Neuroscience.

[26] Jyrki Tuomainen, et al. Lexical effects on auditory speech perception: An electrophysiological study, 2007, Neuroscience Letters.

[27] John J. Foxe, et al. Seeing voices: High-density electrical mapping and source-analysis of the multisensory mismatch negativity evoked during the McGurk illusion, 2007, Neuropsychologia.

[28] A. Samuel, et al. Generalization in perceptual learning for speech, 2006, Psychonomic Bulletin & Review.

[30] A. Samuel, et al. Perceptual learning for speech: Is there a return to normal?, 2005, Cognitive Psychology.

[31] David Poeppel, et al. Visual speech speeds up the neural processing of auditory speech, 2005, Proceedings of the National Academy of Sciences of the United States of America.

[32] A. Fort, et al. Bimodal speech: early suppressive visual effects in human auditory cortex, 2004, The European Journal of Neuroscience.

[33] Béatrice de Gelder, et al. Selective adaptation and recalibration of auditory speech by lipread information: Dissipation, 2004, AVSP.

[34] P. Deltenre, et al. Generalization of the generation of an MMN by illusory McGurk percepts: voiceless consonants, 2004, Clinical Neurophysiology.

[35] Alexandra Fort, et al. Interest and validity of the additive model in electrophysiological studies of multisensory interactions, 2004, Cognitive Processing.

[36] P. Hagoort, et al. The Influence of Semantic and Syntactic Context Constraints on Lexical Selection and Integration in Spoken-Word Comprehension as Revealed by ERPs, 2004, Journal of Cognitive Neuroscience.

[37] Lawrence Brancazio. Lexical influences in audiovisual speech perception, 2004, Journal of Experimental Psychology: Human Perception and Performance.

[38] Paavo Alku, et al. Automatic Auditory Processing of English Words as Indexed by the Mismatch Negativity, Using a Multiple Deviant Paradigm, 2004, Ear and Hearing.

[39] P. Bertelson, et al. Visual Recalibration of Auditory Speech Identification, 2003, Psychological Science.

[40] D. Norris, et al. Perceptual learning in speech, 2003, Cognitive Psychology.

[41] David Poeppel. The analysis of speech in different temporal integration windows: cerebral lateralization as 'asymmetric sampling in time', 2003, Speech Commun.

[42] F. Perrin, et al. Modulation of the N400 potential during auditory phonological/semantic interaction, 2003, Brain Research: Cognitive Brain Research.

[43] S. Shigeno. Anchoring effects in audiovisual speech perception, 2002, The Journal of the Acoustical Society of America.

[44] J. Pernier, et al. Early auditory-visual interactions in human cortex during nonredundant target identification, 2002, Brain Research: Cognitive Brain Research.

[45] P. Deltenre, et al. Mismatch negativity evoked by the McGurk–MacDonald effect: a phonetic representation within short-term memory, 2002, Clinical Neurophysiology.

[46] Paavo Alku, et al. Memory Traces for Words as Revealed by the Mismatch Negativity, 2001, NeuroImage.

[47] A. G. Samuel, et al. Knowing a Word Affects the Fundamental Perception of the Sounds Within it, 2001, Psychological Science.

[48] M. Giard, et al. Auditory-Visual Integration during Multimodal Object Recognition in Humans: A Behavioral and Electrophysiological Study, 1999, Journal of Cognitive Neuroscience.

[49] J. Kounios, et al. Dual-coding, context-availability, and concreteness effects in sentence comprehension: an electrophysiological investigation, 1999, Journal of Experimental Psychology: Learning, Memory, and Cognition.

[50] Mikko Sams. McGurk effect in Finnish syllables, isolated words, and words in sentences: Effects of word meaning and sentence context, 1998, Speech Commun.

[51] A. Samuel. Lexical Activation Produces Potent Phonemic Percepts, 1997, Cognitive Psychology.

[52] J. Connolly, et al. Event-Related Potential Components Reflect Phonological and Semantic Processing of the Terminal Word of Spoken Sentences, 1994, Journal of Cognitive Neuroscience.

[53] L. D. Rosenblum, et al. Selective adaptation in speech perception using a compelling audiovisual adaptor, 1994, The Journal of the Acoustical Society of America.

[54] A. G. Samuel, et al. An empirical and meta-analytic evaluation of the phoneme identification task, 1993, Journal of Experimental Psychology: Human Perception and Performance.

[55] J. Connolly, et al. Event-related potential sensitivity to acoustic and semantic properties of terminal words in sentences, 1992, Brain and Language.

[56] M. Funnell, et al. Audiovisual integration in perception of real words, 1992, Perception & Psychophysics.

[57] H. Neville, et al. Auditory and Visual Semantic Priming in Lexical Decision: A Comparison Using Event-related Brain Potentials, 1990.

[58] J. Connolly, et al. The effects of processing requirements on neurophysiological responses to spoken sentences, 1990, Brain and Language.

[59] A. Samuel. Red herring detectors and speech perception: In defense of selective adaptation, 1986, Cognitive Psychology.

[60] E. Donchin, et al. A new method for off-line removal of ocular artifact, 1983, Electroencephalography and Clinical Neurophysiology.

[61] R. D. Easton, et al. Perceptual dominance during lipreading, 1982, Perception & Psychophysics.

[62] Q. Summerfield, et al. Audiovisual presentation demonstrates that selective adaptation in speech perception is purely auditory, 1981, Perception & Psychophysics.

[63] W. Ganong. Phonetic categorization in auditory word perception, 1980, Journal of Experimental Psychology: Human Perception and Performance.

[64] R. Näätänen, et al. Early selective-attention effect on evoked potential reinterpreted, 1978, Acta Psychologica.

[65] E. Courchesne, et al. Stimulus novelty, task relevance and the visual evoked potential in man, 1975, Electroencephalography and Clinical Neurophysiology.

[66] R. M. Warren. Perceptual Restoration of Missing Speech Sounds, 1970, Science.

[67] W. H. Sumby, et al. Visual contribution to speech intelligibility in noise, 1954.

[68] James L. Morgan, et al. When Hearing Lips and Seeing Voices Becomes Perceiving Speech: Auditory-Visual Integration in Lexical Access, 2011, CogSci.

[69] Kara D. Federmeier, et al. Thirty years and counting: finding meaning in the N400 component of the event-related brain potential (ERP), 2011, Annual Review of Psychology.

[70] A. Samuel, et al. Perceptual adjustments to multiple speakers, 2007.

[71] J. Vroomen, et al. Recalibration of phonetic categories by lipread speech versus lexical information, 2007, Journal of Experimental Psychology: Human Perception and Performance.

[72] J. McQueen, et al. Perceptual learning in speech: stability over time, 2006, The Journal of the Acoustical Society of America.

[73] Vasily Klucharev, et al. Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception, 2003.

[74] Peter Hagoort, et al. Electrophysiological evidence for early contextual influences during spoken-word recognition: the N200, 2000.

[75] A. Samuel. Phonemic restoration: insights from a new methodology, 1981, Journal of Experimental Psychology: General.

[76] S. Holm. A Simple Sequentially Rejective Multiple Test Procedure, 1979.

[77] P. D. Eimas, et al. Selective adaptation of linguistic feature detectors, 1973.

[78] J. Brožek. Attention and Performance II, 1971.

[79] Saul Sternberg. The discovery of processing stages: Extensions of Donders' method, 1969.