Evidence for children’s online integration of simultaneous information from speech and iconic gestures: an ERP study

ABSTRACT Children perceive iconic gestures along with the speech they hear. Previous studies have shown that children integrate information from both modalities. Yet it is not known whether children can integrate both types of information simultaneously, as soon as they are available (as adults do), or whether they initially process them separately and integrate them only later. Using electrophysiological measures, we examined the online neurocognitive processing of gesture-speech integration in 6- to 7-year-old children. We focused on the N400 event-related potential component, which is modulated by semantic integration load. Children watched video clips of matching or mismatching gesture-speech combinations, which varied the semantic integration load. The ERPs showed that the amplitude of the N400 was larger in the mismatching condition than in the matching condition. This finding provides the first neural evidence that by the age of 6 or 7, children integrate multimodal semantic information in an online fashion comparable to that of adults.
