Glove-Based Continuous Arabic Sign Language Recognition in User-Dependent Mode

In this paper, we propose a glove-based Arabic sign language recognition system using a novel technique for sequential data classification. We compile a sensor-based dataset of 40 sentences drawn from an 80-word lexicon. Hand movements are captured using two DG5-VHand data gloves, and data labeling is performed with a camera that synchronizes the hand movements with their corresponding sign language words. Low-complexity preprocessing and feature extraction techniques are applied to capture and emphasize the temporal dependence of the data. A Modified k-Nearest Neighbor (MKNN) classifier is then used; the MKNN exploits the context of neighboring feature vectors to improve classification accuracy. The proposed solution achieves a sentence recognition rate of 98.9%. The results are compared against an existing vision-based approach that uses the same set of sentences; the proposed solution yields higher classification rates while avoiding the restrictions imposed by vision-based systems. A minimal sketch of such a pipeline is shown below.
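
The following Python sketch illustrates one plausible reading of the described pipeline: per-frame glove readings are concatenated with neighboring frames to emphasize temporal dependence, a standard k-NN labels each frame, and a majority filter over adjacent predictions supplies the "context". The window sizes, the data-loading step, and the mode-filter formulation are illustrative assumptions, not the authors' exact MKNN procedure.

```python
import numpy as np
from collections import Counter
from sklearn.neighbors import KNeighborsClassifier

def windowed_features(frames, w=3):
    """Concatenate each frame with its w preceding and w following frames
    to emphasize temporal dependence (window size is an assumed choice)."""
    padded = np.vstack([np.repeat(frames[:1], w, axis=0),
                        frames,
                        np.repeat(frames[-1:], w, axis=0)])
    return np.array([padded[i:i + 2 * w + 1].ravel()
                     for i in range(len(frames))])

def context_smooth(labels, w=5):
    """Replace each per-frame label by the majority label in a local
    window -- one simple way to use the context of feature vectors."""
    smoothed = []
    for i in range(len(labels)):
        lo, hi = max(0, i - w), min(len(labels), i + w + 1)
        smoothed.append(Counter(labels[lo:hi]).most_common(1)[0][0])
    return np.array(smoothed)

# Hypothetical usage: X_train/X_test hold per-frame sensor readings from
# the two DG5-VHand gloves; y_train holds the word label assigned to each
# frame during camera-based labeling. The loading step is not shown.
# X_train, y_train, X_test = load_glove_data()
# knn = KNeighborsClassifier(n_neighbors=5)
# knn.fit(windowed_features(X_train), y_train)
# raw_labels = knn.predict(windowed_features(X_test))
# word_sequence = context_smooth(raw_labels)
```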
