Mutual Visibility and Information Structure Enhance Synchrony between Speech and Co-Speech Movements

Our study aims to gain a better understanding of how speech-gesture synchronization is affected by two factors: (1) mutual visibility and (2) linguistic information structure. To this end, we analyzed spontaneous dyadic interactions in which interlocutors played a verbalized version of the game TicTacToe, both with and without mutual visibility. This setting allows for a straightforward distinction between contextually given and informative game moves, which we studied with respect to their manual and linguistic realization. Speech and the corresponding manual game moves are synchronized more often when there is mutual visibility and when game moves are informative. Mutual visibility leads to a slight precedence of manual moves over the corresponding verbalizations, and to a tighter temporal alignment of speech and co-speech movements. Informative moves counter this movement-precedence effect, allowing co-speech movement targets to synchronize smoothly with prosodic boundaries.
