Gesture Facilitates the Syntactic Analysis of Speech

Recent research suggests that the brain routinely binds together information from gesture and speech. However, most of this work has focused on the integration of representational gestures with the semantic content of speech. Much less is known about how other aspects of gesture, such as emphasis, influence the interpretation of syntactic relations in a spoken message. Here, we investigated whether beat gestures alter which syntactic structure is assigned to ambiguous spoken German sentences. The P600 component of the event-related brain potential (ERP) indicated that the more complex syntactic structure is easier to process when the speaker emphasizes the subject of the sentence with a beat. Thus, a simple flick of the hand can change our interpretation of who has been doing what to whom in a spoken sentence. We conclude that gesture and speech form an integrated system. Whereas previous studies have shown that the brain effortlessly integrates semantic information from gesture and speech, our study is the first to demonstrate that this integration also occurs for syntactic information. Moreover, the effect appears to be gesture-specific: it was not found for other stimuli that draw attention to particular parts of speech, such as prosodic emphasis or a moving visual stimulus with the same trajectory as the gesture. This suggests that language comprehension is influenced only by visual emphasis produced with a communicative intention (that is, beat gestures), not by a simple visual movement lacking such an intention.
