Understanding requires tracking: noise and knowledge interact in bilingual comprehension

Understanding speech in noise is a fundamental challenge for listeners. This perceptual demand is amplified in a second language: as is commonly experienced in bars, train stations, and other noisy environments, degraded signal quality severely compromises second language comprehension. Using a novel design paired with a carefully selected participant profile, we independently assessed signal-driven and knowledge-driven contributions to the brain bases of first versus second language processing. The neurophysiological data show that bilinguals' difficulty understanding second language speech in noisy conditions stems from a failure to perform a basic, automatic, low-level process: cortical entrainment to the speech signal above the syllabic level.
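As a purely illustrative aside (not the authors' analysis pipeline), cortical entrainment to speech is commonly quantified as coherence between the speech amplitude envelope and a neural recording, averaged within the delta (~1-4 Hz, roughly phrasal-scale) and theta (~4-8 Hz, roughly syllabic-scale) bands. The sketch below uses synthetic data; the sampling rate, band limits, and noise levels are assumptions chosen only to make the example self-contained.

```python
import numpy as np
from scipy.signal import hilbert, coherence

fs = 200.0                          # sampling rate in Hz (illustrative assumption)
t = np.arange(0, 120, 1 / fs)       # two minutes of synthetic data
rng = np.random.default_rng(0)

# Toy "speech": a noise carrier amplitude-modulated at a syllabic (~5 Hz)
# and a phrasal (~2 Hz) rate.
modulation = 1 + 0.5 * np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 2 * t)
speech = modulation * rng.standard_normal(t.size)

# Amplitude envelope of the speech signal via the Hilbert transform.
envelope = np.abs(hilbert(speech))

# Toy "neural" recording: tracks the envelope imperfectly, plus noise.
neural = 0.5 * envelope + rng.standard_normal(t.size)

# Magnitude-squared coherence between the envelope and the neural signal.
f, cxy = coherence(envelope, neural, fs=fs, nperseg=int(4 * fs))

delta = cxy[(f >= 1) & (f < 4)].mean()   # delta band (~phrasal rate)
theta = cxy[(f >= 4) & (f < 8)].mean()   # theta band (~syllabic rate)
print(f"delta coherence: {delta:.2f}  theta coherence: {theta:.2f}")
```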
