Two gesture recognition systems for immersive math education of the deaf

The general goal of our research is the creation of a natural and intuitive interface for navigation, interaction, and input/recognition of American Sign Language (ASL) math signs in immersive virtual environments (VEs) for the Deaf. The specific objective of this work is the development of two new gesture recognition systems for SMILE™, an immersive learning game that uses a fantasy 3D virtual environment to engage deaf children in math-based educational tasks. Presently, SMILE includes standard VR interaction devices such as a 6DOF wand, a pair of pinch gloves, and a dance platform. In this paper we propose a significant improvement to the application in the form of two new gesture control mechanisms: system (1) is based entirely on hand gestures and uses a pair of 18-sensor data gloves, while system (2) is based on hand and body gestures and uses a pair of data gloves together with a motion tracking system. Both interfaces support first-person motion control, object selection and manipulation, and real-time input/recognition of the ASL numbers zero through twenty. Although the systems described in this paper rely on high-end, expensive hardware, they can be considered a first step toward an effective immersive sign language interface.
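The abstract does not specify the recognition algorithm behind the glove-based input. As a purely illustrative sketch, the following shows one simple way real-time posture recognition from an 18-sensor data glove could work: nearest-neighbor matching of a live sensor reading against calibrated per-sign templates. All names, values, and the threshold are hypothetical and not taken from the paper.

```python
import math

# Hypothetical sketch only: nearest-neighbor template matching over an
# 18-value flex-sensor reading from one data glove. The paper's actual
# recognition method is not described in the abstract.

NUM_SENSORS = 18  # matches the 18-sensor gloves mentioned in the abstract

def distance(a, b):
    """Euclidean distance between two sensor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(reading, templates, threshold=0.5):
    """Return the label of the closest stored template, or None when no
    template lies within `threshold` (an unrecognized posture)."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        d = distance(reading, template)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label if best_dist <= threshold else None

# Toy calibration data: flat vectors standing in for recorded hand postures.
templates = {
    "ASL_0": [0.9] * NUM_SENSORS,         # closed, rounded posture
    "ASL_1": [0.1] * 4 + [0.9] * 14,      # index extended, rest flexed
}

reading = [0.88] * NUM_SENSORS  # a live sample near the "ASL_0" template
print(classify(reading, templates))  # prints "ASL_0"
```

A production system would add per-user calibration, temporal smoothing over successive readings, and a model (e.g., an HMM or neural network, as in the sign language recognition literature) for dynamic signs rather than static postures.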
