The benefit of gestures during communication: Evidence from hearing and hearing-impaired individuals

There is no doubt that gestures are communicative and can be integrated online with speech. Little is known, however, about the nature of this process, for example, its automaticity and how our own communicative abilities and our environment influence the integration of gesture and speech. In two event-related potential (ERP) experiments, the effects of gestures during speech comprehension were explored. In both experiments, participants performed a shallow task, thereby avoiding explicit gesture-speech integration. In Experiment 1, participants with normal hearing viewed videos in which a gesturing actress uttered sentences that were either embedded in multi-speaker babble noise or presented without noise. Each sentence contained a homonym that was disambiguated by information in a gesture presented asynchronously to speech (1000 msec earlier). Downstream, the sentence contained a target word related to either the dominant or the subordinate meaning of the homonym, which was used to indicate the success of the disambiguation. At both the homonym and the target word, clear ERP evidence of gesture-speech integration and disambiguation emerged only under babble noise. Thus, in noise, gestures were taken into account as an important communicative cue. In Experiment 2, the same asynchronous stimuli were presented to a group of hearing-impaired students and age-matched controls. Only the hearing-impaired individuals showed significant speech-gesture integration and successful disambiguation at the target word; the age-matched controls showed no effect. Thus, individuals who chronically experience suboptimal communicative situations in daily life automatically take gestures into account. The data from both experiments indicate that gestures are beneficial in countering difficult communication conditions, regardless of whether the difficulties are due to external (babble noise) or internal (hearing impairment) factors.
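To make the dependent measure concrete, the sketch below shows how condition-average ERPs time-locked to a word onset, and a mean-amplitude measure in a latency window, could be computed from a continuous EEG recording. This is a minimal NumPy sketch under assumed names and parameters (a channel-by-sample array, a 300-500 msec window of the kind typically used for semantic integration effects such as the N400); it is not the study's actual analysis pipeline.

```python
import numpy as np

# Minimal sketch of ERP epoch averaging. Assumptions (not from the study):
# `eeg` is a continuous recording of shape (n_channels, n_samples) sampled
# at `srate` Hz, and `onsets` lists the stimulus-onset sample indices for
# one condition (e.g., target words related to the dominant meaning).

def epoch_average(eeg, onsets, srate, tmin=-0.2, tmax=1.0):
    """Cut baseline-corrected epochs around each onset and average them."""
    pre = int(-tmin * srate)    # samples before onset (baseline period)
    post = int(tmax * srate)    # samples after onset
    epochs = []
    for onset in onsets:
        seg = eeg[:, onset - pre : onset + post]
        # Subtract the mean of the pre-stimulus baseline per channel.
        seg = seg - seg[:, :pre].mean(axis=1, keepdims=True)
        epochs.append(seg)
    # Average across trials -> ERP of shape (n_channels, n_times).
    return np.mean(epochs, axis=0)

def mean_amplitude(erp, srate, tmin=-0.2, win=(0.3, 0.5)):
    """Mean amplitude per channel in a post-onset latency window (sec)."""
    start = int((win[0] - tmin) * srate)
    stop = int((win[1] - tmin) * srate)
    return erp[:, start:stop].mean(axis=1)
```

Given separate onset lists for the two target-word conditions, the difference between the two condition averages in the chosen window would index whether the gesture's disambiguating information was integrated with the speech.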
