Real-Time Malaysian Sign Language Translation using Colour Segmentation and Neural Network

In this paper, we present an automatic vision-based sign language translation system. Our proposed translator provides a real-time English translation of Malaysian Sign Language (MSL). To date, studies on vision-based (video camera) sign language recognition have either been limited to a small sign lexicon or focused solely on fingerspelling, and the two problems are typically tackled with different approaches. In practice, fingerspelling is used when a word cannot be expressed as a sign gesture. Our translator recognises both fingerspelling and sign gestures, covering static as well as motion signs, and uses trained neural networks to identify the signs and translate them into English.
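As a rough illustration of the colour-segmentation stage mentioned in the title, a common approach is to threshold pixels in the YCbCr colour space, where skin tones cluster in a compact Cb/Cr region. The sketch below is an assumption about how such a stage might look, not the paper's actual implementation; the conversion coefficients are the standard ITU-R BT.601 ones, and the Cb/Cr thresholds are illustrative values from the skin-detection literature:

```python
import numpy as np

def skin_mask(rgb: np.ndarray) -> np.ndarray:
    """Return a boolean skin mask for an H x W x 3 uint8 RGB image.

    Uses fixed Cb/Cr thresholds (illustrative, not from the paper):
    77 <= Cb <= 127 and 133 <= Cr <= 173.
    """
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    # ITU-R BT.601 RGB -> Cb, Cr (luma channel is not needed for the mask)
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)

# Toy image: one skin-toned pixel and one saturated blue pixel.
img = np.array([[[200, 140, 110], [0, 0, 255]]], dtype=np.uint8)
mask = skin_mask(img)  # mask[0, 0] is True, mask[0, 1] is False
```

The resulting binary mask would then feed the hand-tracking and feature-extraction steps before neural-network classification.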
