This paper presents an early exploration of the suitability of the Leap Motion controller for Australian Sign Language (Auslan) recognition. Testing showed that the controller can accurately track hands and fingers and follow their movement. This accuracy degrades when the hand moves into a position that obstructs the controller's view, such as when the hand rotates to be perpendicular to the controller. Detection also fails when individual elements of the hands are brought together, such as finger touching finger. In both of these circumstances, the controller is unable to read or track the hand. This technology holds potential for recognising Auslan; however, further development of the Leap Motion API is required.