Movements and Holds in Fluent Sentence Production of American Sign Language: The Action-Based Approach

The importance of bodily movements in the production and perception of communicative actions has been demonstrated for the spoken language modality and accounted for by a theory of communicative actions (Cogn. Process. 2010;11:187–205). In this study, that theory was adapted to the sign language modality; we tested the hypothesis that in the fluent production of short sign language sentences, strong-hand manual sign actions proceed continuously without holds, whereas co-manual oral expression actions (i.e. sign-related actions of the lips, jaw, and tip of the tongue) and co-manual facial expression actions (i.e. actions of the eyebrows, eyelids, etc.), as well as weak-hand actions, show considerable holds. An American Sign Language (ASL) corpus of 100 sentences was analyzed by visually inspecting each frame-to-frame difference (30 frames/s) to separate movement and hold phases for each manual, oral, and facial action. Excluding fingerspelling and signs in sentence-final position, no manual holds were found for the strong hand (0%; the weak hand was not considered), whereas oral holds occurred in 22% of all oral expression actions and facial holds occurred in all facial expression actions analyzed (100%). These results support the idea that in each language modality, the dominant articulatory system (vocal tract or manual system) determines the timing of actions. In signed languages, where manual actions are dominant, holds occur mainly in co-manual oral and co-manual facial actions. Conversely, in spoken languages, vocal tract actions (i.e. actions of the lips, tongue, jaw, velum, and vocal folds) are dominant, and holds occur primarily in co-verbal manual and co-verbal facial actions.
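The movement/hold segmentation described above can be sketched in code: given a per-frame position track for one articulator, classify each inter-frame interval as movement or hold and merge runs into phases. This is an illustrative sketch only, not the authors' actual coding procedure (which relied on visual inspection of frame-to-frame differences); the displacement threshold, minimum hold duration, and function names are hypothetical.

```python
FPS = 30               # video frame rate from the study (30 frames/s)
HOLD_THRESHOLD = 0.5   # hypothetical displacement threshold (e.g. pixels/frame)
MIN_HOLD_FRAMES = 3    # hypothetical minimum hold duration (~100 ms at 30 fps)

def frame_displacements(positions):
    """Per-frame Euclidean displacement of one articulator's (x, y) track."""
    return [
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (x1, y1), (x2, y2) in zip(positions, positions[1:])
    ]

def segment_phases(positions):
    """Return a list of ('move' | 'hold', n_frames) phases.

    Each inter-frame interval is labeled by thresholding its displacement;
    adjacent intervals with the same label are merged, and still runs
    shorter than MIN_HOLD_FRAMES are reclassified as movement, mimicking
    an annotator who ignores momentary stillness.
    """
    labels = ['hold' if d < HOLD_THRESHOLD else 'move'
              for d in frame_displacements(positions)]
    phases = []
    for label in labels:
        if phases and phases[-1][0] == label:
            phases[-1][1] += 1
        else:
            phases.append([label, 1])
    # Reclassify too-short holds as movement, then re-merge adjacent phases.
    merged = []
    for label, n in phases:
        if label == 'hold' and n < MIN_HOLD_FRAMES:
            label = 'move'
        if merged and merged[-1][0] == label:
            merged[-1][1] += n
        else:
            merged.append([label, n])
    return [(label, n) for label, n in merged]

def has_hold(positions):
    """True if the track contains at least one hold phase."""
    return any(label == 'hold' for label, _ in segment_phases(positions))
```

Applied per articulator (strong hand, lips/jaw/tongue tip, eyebrows/eyelids), the fraction of actions for which `has_hold` is true corresponds to the hold percentages reported above (0% strong-hand, 22% oral, 100% facial).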

[1]  Bernd J. Kröger,et al.  Towards a neurocomputational model of speech production and perception , 2009, Speech Commun.

[2]  J. Cohn,et al.  Movement Differences between Deliberate and Spontaneous Facial Expressions: Zygomaticus Major Action in Smiling , 2006, Journal of nonverbal behavior.

[3]  Scott K. Liddell Grammar, Gesture, and Meaning in American Sign Language , 2003 .

[4]  Jeffrey F. Cohn,et al.  Foundations of human computing: facial expression and emotion , 2006, ICMI '06.

[5]  Peter Birkholz,et al.  Articulatory Synthesis of Speech and Singing: State of the Art and Suggestions for Future Research , 2009, COST 2102 School.

[6]  M. Studdert-Kennedy Hand and Mind: What Gestures Reveal About Thought. , 1994 .

[7]  A. Liberman,et al.  The motor theory of speech perception revised , 1985, Cognition.

[8]  Sabina Fontana Mouth actions as gesture in sign language , 2008 .

[9]  Stefan Kopp,et al.  A model for production, perception, and acquisition of actions in face-to-face communication , 2010, Cognitive Processing.

[10]  W. Stokoe,et al.  Sign language structure: an outline of the visual communication systems of the American deaf (1960), Journal of deaf studies and deaf education.

[11]  Karen L. Schmidt,et al.  Comparison of Deliberate and Spontaneous Facial Movement in Smiles and Eyebrow Raises , 2009, Journal of nonverbal behavior.

[12]  Dani Byrd,et al.  Dynamic action units slip in speech production errors , 2007, Cognition.

[13]  Scott K. Liddell,et al.  American Sign Language: The Phonological Base , 2013 .

[14]  D. McNeill Gesture and Thought , 2005 .

[15]  Hedda Lausberg,et al.  Methods in Gesture Research , 2009 .

[16]  W. Sandler Symbiotic symbolization by hand and mouth in sign language , 2009, Semiotica.

[17]  J. Cohn,et al.  Deciphering the Enigmatic Face , 2005, Psychological science.

[18]  C. Browman,et al.  Articulatory Phonology: An Overview , 1992, Phonetica.

[19]  D. McNeill Language and Gesture , 2000 .

[20]  Jeffrey F. Cohn,et al.  Observer-based measurement of facial expression with the Facial Action Coding System. , 2007 .

[21]  John J. B. Allen,et al.  The handbook of emotion elicitation and assessment , 2007 .

[22]  Anna Esposito,et al.  Multimodal Signals: Cognitive and Algorithmic Issues, COST Action 2102 and euCognition International School Vietri sul Mare, Italy, April 21-26, 2008, Revised Selected and Invited Papers , 2009, COST 2102 School.

[23]  Peter Birkholz,et al.  A Gesture-Based Concept for Speech Movement Control in Articulatory Speech Synthesis , 2007, COST 2102 Workshop.

[24]  P. Ekman Facial expression and emotion. , 1993, The American psychologist.

[25]  Sherman Wilcox,et al.  Empirical methods in signed language research , 2007 .

[26]  Karen L. Schmidt,et al.  Signal characteristics of spontaneous facial expressions: automatic movement in solitary and social smiles , 2003, Biological Psychology.

[27]  Adam Kendon,et al.  Language and gesture: unity or duality? In: Language and Gesture , 2000 .

[28]  Dani Byrd,et al.  Action to Language via the Mirror Neuron System: The role of vocal tract gestural action units in understanding the evolution of phonology , 2006 .

[29]  Monica Gonzalez-Marquez,et al.  Methods in cognitive linguistics , 2007 .

[31]  Anil K. Jain,et al.  Handbook of Face Recognition, 2nd Edition , 2011 .

[32]  Takeo Kanade,et al.  Facial Expression Analysis , 2011, AMFG.

[33]  Melanie Metzger,et al.  Gesture in sign language discourse , 1998 .

[34]  P. Ekman,et al.  Measuring facial movement , 1976 .

[35]  Stefan Kopp,et al.  Synthesizing multimodal utterances for conversational agents , 2004, Comput. Animat. Virtual Worlds.

[36]  David M. Perlmutter  Sonority and Syllable Structure in American Sign Language , 1993 .

[37]  K. Emmorey Language, Cognition, and the Brain: Insights From Sign Language Research , 2001 .

[38]  M. Arbib Action to language via the mirror neuron system , 2006 .

[39]  Bernd J. Kröger,et al.  Articulatory Speech Re-synthesis: Profiting from Natural Acoustic Speech Data , 2009, COST 2102 Conference.

[40]  Louis Goldstein,et al.  Articulatory gestures as phonological units , 1989, Phonology.

[41]  E. Klima The signs of language , 1979 .

[42]  Hermann Ney,et al.  Speech recognition techniques for a sign language recognition system , 2007, INTERSPEECH.

[43]  Francis K. H. Quek,et al.  Catchments, prosody and discourse , 2001 .

[44]  Ceil Lucas,et al.  Linguistics of American Sign Language: An Introduction , 1995 .

[45]  Fernando De la Torre,et al.  Facial Expression Analysis , 2011, Visual Analysis of Humans.

[46]  Dani Byrd,et al.  Task-dynamics of gestural timing: Phase windows and multifrequency rhythms , 2000 .

[47]  A. Kendon Gesture: Visible Action as Utterance , 2004 .