Noise alters beta-band activity in superior temporal cortex during audiovisual speech processing
Andreas K. Engel | Till R. Schneider | Joerg F. Hipp | Daniel Senkowski | Inga M. Schepers
[1] O. Bertrand,et al. Visual Activation and Audiovisual Interactions in the Auditory Cortex during Speech Perception: Intracranial Recordings in Humans , 2008, The Journal of Neuroscience.
[2] W. H. Sumby,et al. Visual contribution to speech intelligibility in noise , 1954 .
[3] Wei Ji Ma,et al. Lip-Reading Aids Word Recognition Most in Moderate Noise: A Bayesian Explanation Using High-Dimensional Feature Space , 2009, PLoS ONE.
[4] Gregory McCarthy,et al. Polysensory interactions along lateral temporal regions evoked by audiovisual speech. , 2003, Cerebral cortex.
[5] R. Romo,et al. Beta oscillations in the monkey sensorimotor network reflect somatosensory decision making , 2011, Proceedings of the National Academy of Sciences.
[6] N. Logothetis,et al. Directed Interactions between Auditory and Superior Temporal Cortices and Their Role in Sensory Integration , 2009, Frontiers in Integrative Neuroscience.
[7] K. Reinikainen,et al. Selective attention enhances the auditory 40-Hz transient response in humans , 1993, Nature.
[8] A. Labarga,et al. Gamma band activity in an auditory oddball paradigm studied with the wavelet transform , 2001, Clinical Neurophysiology.
[9] J. Rieger,et al. Audiovisual Temporal Correspondence Modulates Human Multisensory Superior Temporal Sulcus Plus Primary Sensory Cortices , 2007, The Journal of Neuroscience.
[10] Joachim Gross,et al. Phase-Locked Responses to Speech in Human Auditory Cortex are Enhanced During Comprehension , 2012, Cerebral cortex.
[11] R. Oostenveld,et al. Tactile Spatial Attention Enhances Gamma-Band Activity in Somatosensory Cortex and Reduces Low-Frequency Activity in Parieto-Occipital Areas , 2006, The Journal of Neuroscience.
[12] G. Ermentrout,et al. Gamma rhythms and beta rhythms have different synchronization properties. , 2000, Proceedings of the National Academy of Sciences of the United States of America.
[13] R V Shannon,et al. Speech Recognition with Primarily Temporal Cues , 1995, Science.
[14] Audrey R. Nath,et al. Dynamic Changes in Superior Temporal Sulcus Connectivity during Perception of Noisy Audiovisual Speech , 2011, The Journal of Neuroscience.
[15] Miles A. Whittington,et al. Human Neuroscience , 2022 .
[16] Y. Benjamini,et al. Controlling the false discovery rate: a practical and powerful approach to multiple testing , 1995 .
[17] R. Campbell,et al. Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex , 2000, Current Biology.
[18] John J. Foxe,et al. Crossmodal binding through neural coherence: implications for multisensory processing , 2008, Trends in Neurosciences.
[19] J Gross,et al. REPRINTS , 1962, The Lancet.
[20] I. Nelken,et al. Transient Induced Gamma-Band Response in EEG as a Manifestation of Miniature Saccades , 2008, Neuron.
[21] J. Obleser,et al. Auditory evoked fields differentially encode speech features: an MEG investigation of the P50m and N100m time courses during syllable processing , 2007, The European journal of neuroscience.
[22] A. Engel,et al. Emotional Facial Expressions Modulate Pain-Induced Beta and Gamma Oscillations in Sensorimotor Cortex , 2011, The Journal of Neuroscience.
[23] N. Barbaro,et al. Spatiotemporal Dynamics of Word Processing in the Human Brain , 2007, Front. Neurosci..
[24] M. Murray,et al. EEG source imaging , 2004, Clinical Neurophysiology.
[25] Jeffery A. Jones,et al. Neural processes underlying perceptual enhancement by visual speech gestures , 2003, Neuroreport.
[26] Andreas K. Engel,et al. Oscillatory Synchronization in Large-Scale Cortical Networks Predicts Perception , 2011, Neuron.
[27] Uta Noppeney,et al. Physical and Perceptual Factors Shape the Neural Mechanisms That Integrate Audiovisual Signals in Speech Comprehension , 2011, The Journal of Neuroscience.
[28] Terrence J. Sejnowski,et al. An Information-Maximization Approach to Blind Separation and Blind Deconvolution , 1995, Neural Computation.
[29] F. Perrin,et al. Tonotopic organization of the human auditory cortex: N100 topography and multiple dipole model analysis. , 1995, Electroencephalography and clinical neurophysiology.
[30] John J. Foxe,et al. Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments. , 2006, Cerebral cortex.
[31] N. Kraus,et al. What subcortical–cortical relationships tell us about processing speech in noise , 2011, The European journal of neuroscience.
[32] Lynne E. Bernstein,et al. Auditory speech detection in noise enhanced by lipreading , 2004, Speech Commun..
[33] D. Poeppel,et al. Auditory Cortex Tracks Both Auditory and Visual Stimulus Dynamics Using Low-Frequency Neuronal Phase Modulation , 2010, PLoS biology.
[34] Luc H. Arnal,et al. Transitions in neural oscillations reflect prediction errors generated in audiovisual speech , 2011, Nature Neuroscience.
[35] S Makeig,et al. Human auditory evoked gamma-band magnetic fields. , 1991, Proceedings of the National Academy of Sciences of the United States of America.
[36] David Poeppel,et al. Visual speech speeds up the neural processing of auditory speech. , 2005, Proceedings of the National Academy of Sciences of the United States of America.
[37] Kensuke Sekihara,et al. Modified beamformers for coherent source region suppression , 2006, IEEE Transactions on Biomedical Engineering.
[38] B. Stein,et al. Interactions among converging sensory inputs in the superior colliculus. , 1983, Science.
[39] Jean Vroomen,et al. Neural Correlates of Multisensory Integration of Ecologically Valid Audiovisual Events , 2007, Journal of Cognitive Neuroscience.
[40] Masaaki Nishida,et al. Cortical gamma-oscillations modulated by listening and overt repetition of phonemes , 2010, NeuroImage.
[41] Marty G. Woldorff,et al. Selective Attention and Multisensory Integration: Multiple Phases of Effects on the Evoked Brain Activity , 2005, Journal of Cognitive Neuroscience.
[42] E. Miller,et al. Top-Down Versus Bottom-Up Control of Attention in the Prefrontal and Posterior Parietal Cortices , 2007, Science.
[43] D. M. Green,et al. Signal detection theory and psychophysics , 1966 .
[44] David Poeppel,et al. Cortical oscillations and speech processing: emerging computational principles and operations , 2012, Nature Neuroscience.
[45] John J. Foxe,et al. Impaired multisensory processing in schizophrenia: Deficits in the visual enhancement of speech comprehension under noisy environmental conditions , 2007, Schizophrenia Research.
[46] P. Mitra,et al. Analysis of dynamic brain imaging data. , 1998, Biophysical journal.
[47] E. Bullmore,et al. Activation of auditory cortex during silent lipreading. , 1997, Science.
[48] Jonas Obleser,et al. Magnetic Brain Response Mirrors Extraction of Phonological Features from Spoken Vowels , 2004, Journal of Cognitive Neuroscience.
[49] David M. Groppe,et al. Mass univariate analysis of event-related brain potentials/fields I: a critical tutorial review. , 2011, Psychophysiology.
[50] S. Bressler,et al. Response preparation and inhibition: The role of the cortical sensorimotor beta rhythm , 2008, Neuroscience.
[51] Robert Oostenveld,et al. Imaging the human motor system’s beta-band synchronization during isometric contraction , 2008, NeuroImage.
[52] Daniel Senkowski,et al. Good times for multisensory integration: Effects of the precision of temporal synchrony as revealed by gamma-band oscillations , 2007, Neuropsychologia.
[53] Luc H. Arnal,et al. Dual Neural Routing of Visual Facilitation in Speech Processing , 2009, The Journal of Neuroscience.
[54] Asif A Ghazanfar,et al. Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus. , 2009, Journal of neurophysiology.
[55] Bijan Pesaran,et al. Free choice activates a decision circuit between frontal and parietal cortex , 2008, Nature.
[56] Asif A Ghazanfar,et al. Interactions between the Superior Temporal Sulcus and Auditory Cortex Mediate Dynamic Face/Voice Integration in Rhesus Monkeys , 2008, The Journal of Neuroscience.
[57] Jeffery A. Jones,et al. Multisensory Integration Sites Identified by Perception of Spatial Wavelet Filtered Visual Speech Gesture Information , 2004, Journal of Cognitive Neuroscience.
[58] Andreas K. Engel,et al. Gamma-band activity reflects multisensory matching in working memory , 2009, Experimental Brain Research.
[59] Friedemann Pulvermüller,et al. Ultra-rapid access to words in the brain , 2012, Nature Communications.
[60] Ryan A. Stevenson,et al. Audiovisual integration in human superior temporal sulcus: Inverse effectiveness and the neural processing of speech and object recognition , 2009, NeuroImage.
[61] C. Schroeder,et al. Neuronal Oscillations and Multisensory Interaction in Primary Auditory Cortex , 2007, Neuron.
[62] Daniel Senkowski,et al. Gamma-Band Activity as a Signature for Cross-Modal Priming of Auditory Object Recognition by Active Haptic Exploration , 2011, The Journal of Neuroscience.
[63] A. Puce,et al. Neuronal oscillations and visual amplification of speech , 2008, Trends in Cognitive Sciences.
[64] A. Fort,et al. Bimodal speech: early suppressive visual effects in human auditory cortex , 2004, The European journal of neuroscience.
[65] Lee M. Miller,et al. Perceptual Fusion and Stimulus Coincidence in the Cross-Modal Integration of Speech , 2005, The Journal of Neuroscience.
[66] I. Winkler,et al. The concept of auditory stimulus representation in cognitive neuroscience. , 1999, Psychological bulletin.
[67] G. Pfurtscheller. Central beta rhythm during sensorimotor activities in man. , 1981, Electroencephalography and clinical neurophysiology.
[68] W. Drongelen,et al. Localization of brain electrical activity via linearly constrained minimum variance spatial filtering , 1997, IEEE Transactions on Biomedical Engineering.
[69] U. Noppeney,et al. Distinct Functional Contributions of Primary Sensory and Association Areas to Audiovisual Integration in Object Categorization , 2010, The Journal of Neuroscience.
[70] Fan-Gang Zeng,et al. Speech recognition with amplitude and frequency modulations. , 2005, Proceedings of the National Academy of Sciences of the United States of America.
[71] D. Poeppel,et al. Phase Patterns of Neuronal Responses Reliably Discriminate Speech in Human Auditory Cortex , 2007, Neuron.
[72] John J. Foxe,et al. Multisensory interactions in early evoked brain activity follow the principle of inverse effectiveness , 2011, NeuroImage.
[73] A. Engel,et al. Neuronal Synchronization along the Dorsal Visual Pathway Reflects the Focus of Spatial Attention , 2008, Neuron.
[74] Audrey R. Nath,et al. fMRI-Guided Transcranial Magnetic Stimulation Reveals That the Superior Temporal Sulcus Is a Cortical Locus of the McGurk Effect , 2010, The Journal of Neuroscience.
[75] M. Sams,et al. Time course of multisensory interactions during audiovisual speech perception in humans: a magnetoencephalographic study , 2004, Neuroscience Letters.
[76] Robert Oostenveld,et al. FieldTrip: Open Source Software for Advanced Analysis of MEG, EEG, and Invasive Electrophysiological Data , 2010, Comput. Intell. Neurosci..
[77] Evan Balaban,et al. Multivariate activation and connectivity patterns discriminate speech intelligibility in Wernicke's, Broca's, and Geschwind's areas. , 2013, Cerebral cortex.
[78] Brigitte Röder,et al. A new method for detecting interactions between the senses in event-related potentials , 2006, Brain Research.
[79] B. Argall,et al. Integration of Auditory and Visual Information about Objects in Superior Temporal Sulcus , 2004, Neuron.
[80] A. Engel,et al. Beta-band oscillations—signalling the status quo? , 2010, Current Opinion in Neurobiology.
[81] Asif A Ghazanfar,et al. Dynamic, rhythmic facial expressions and the superior temporal sulcus of macaque monkeys: implications for the evolution of audiovisual speech , 2010, The European journal of neuroscience.
[82] S A Hillyard,et al. An analysis of audio-visual crossmodal integration by means of event-related potential (ERP) recordings. , 2002, Brain research. Cognitive brain research.
[83] Asif A. Ghazanfar,et al. Monkeys and Humans Share a Common Computation for Face/Voice Integration , 2011, PLoS Comput. Biol..
[84] Jonas Obleser,et al. Suppressed alpha oscillations predict intelligibility of speech and its acoustic details. , 2012, Cerebral cortex.
[85] Peter Brown,et al. Boosting Cortical Activity at Beta-Band Frequencies Slows Movement in Humans , 2009, Current Biology.