Hands That Speak: An Integrated Approach to Studying Complex Human Communicative Body Movements

Gestures, the visible body movements that are ubiquitous in human behavior, are key elements of natural communication. Understanding them is fundamental to designing computing applications with more natural forms of interaction. Both sign languages and everyday gestures reveal the rich signal capacity of this modality. However, although research is developing at a fast pace, we still lack an in-depth understanding of the elements that create the underlying symbolic signals, partly because of a lack of tools for studying communicative movements in context. We introduce a novel approach to this problem based on unobtrusive depth cameras, along with an infrastructure we developed to support naturalistic data collection. While we focus on sign language and gesture, the tools we developed are applicable to other types of body-based research. We report on the quality of the collected data, and we show how our approach can lead to novel insights into communicative movements.
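
The abstract does not spell out the processing pipeline, but the minimal Python sketch below illustrates the kind of analysis a depth-camera infrastructure of this sort enables: segmenting candidate movement strokes from a tracked hand joint. The skeleton representation (per-frame 3D joint positions in meters), the frame rate, and all thresholds are assumptions made for illustration, not details taken from the paper.

    import numpy as np

    FPS = 30  # assumed depth-camera frame rate (frames per second)

    def smooth(traj, window=5):
        # Moving-average smoothing of an (n_frames, 3) joint trajectory.
        kernel = np.ones(window) / window
        return np.column_stack(
            [np.convolve(traj[:, d], kernel, mode="same") for d in range(3)])

    def movement_segments(traj, speed_threshold=0.15, min_frames=6):
        # Frame-index (start, end) pairs where the joint moves faster than
        # speed_threshold (m/s) for at least min_frames consecutive frames.
        speed = np.linalg.norm(np.diff(smooth(traj), axis=0), axis=1) * FPS
        moving = speed > speed_threshold
        segments, start = [], None
        for i, flag in enumerate(moving):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if i - start >= min_frames:
                    segments.append((start, i))
                start = None
        if start is not None and len(moving) - start >= min_frames:
            segments.append((start, len(moving)))
        return segments

    # Synthetic usage: a hand at rest (sensor noise only), then a short stroke.
    rng = np.random.default_rng(0)
    rest = rng.normal(0.0, 0.002, (60, 3))
    stroke = rest[-1] + np.cumsum(np.full((20, 3), 0.01), axis=0)
    hand = np.vstack([rest, stroke, np.tile(stroke[-1], (60, 1))])
    print(movement_segments(hand))  # one segment spanning the stroke

A speed threshold with a minimum duration is only a simple stand-in for robust stroke segmentation; real depth-camera data would also call for denoising and outlier handling before any such analysis.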
