Native language status of the listener modulates the neural integration of speech and iconic gestures in clear and adverse listening conditions

Native listeners neurally integrate iconic gestures with speech, which can enhance comprehension of degraded speech. However, it is unknown how non-native listeners neurally integrate speech and gestures, as they might process visual semantic context differently than natives. We recorded EEG while native and highly proficient non-native listeners watched videos of an actress uttering an action verb in clear or degraded speech, accompanied by a matching gesture ('to drive' + driving gesture) or a mismatching gesture ('to drink' + mixing gesture). Degraded speech elicited an enhanced N400 amplitude compared to clear speech in both groups, revealing an increase in the neural resources needed to resolve the spoken input. A larger N400 effect was found in clear speech for non-natives compared to natives, but in degraded speech only for natives. Non-native listeners might thus process gesture more strongly than natives when speech is clear, but need more auditory cues to access gestural semantic information when speech is degraded.