K-nearest correlated neighbor classification for Indian sign language gesture recognition using feature fusion

A sign language recognition system is an attempt to bring the speech- and hearing-impaired community closer to more regular and convenient forms of communication. Such a system must recognize gestures from a sign language and convert them into a form that hearing people can easily understand. The model proposed in this paper recognizes static images of signed alphabets in Indian Sign Language (ISL). Unlike the alphabets of other sign languages, such as American Sign Language and Chinese Sign Language, the ISL alphabet contains both single-handed and double-handed signs. Hence, to simplify recognition, the model first categorizes each sign as single-handed or double-handed. For both categories, two kinds of features, namely HOG and SIFT, are extracted from a set of training images and combined into a single matrix. The HOG and SIFT features of the input test image are then combined with the HOG and SIFT feature matrices of the training set. Correlation is computed over these matrices and fed to a K-Nearest Neighbor classifier to obtain the classification of the test image.
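
The following is a minimal sketch of the pipeline the abstract describes (HOG + SIFT fusion, correlation with the training matrix, K-Nearest Neighbor voting). The image size, HOG parameters, the averaging of variable-length SIFT descriptors into one fixed-length vector, the use of Pearson correlation, and k = 3 are all assumptions made for illustration; the paper does not specify these details.

```python
# Sketch of the correlation-based KNN pipeline, not the authors' exact implementation.
# Assumptions: grayscale images resized to 128x128, SIFT descriptors averaged per image,
# Pearson correlation as the similarity measure, k = 3 neighbors.
import cv2
import numpy as np
from skimage.feature import hog

IMG_SIZE = (128, 128)          # assumed working resolution
sift = cv2.SIFT_create()       # requires OpenCV >= 4.4

def extract_features(gray):
    """Fuse HOG and SIFT features of one grayscale image into a single vector."""
    gray = cv2.resize(gray, IMG_SIZE)
    hog_vec = hog(gray, orientations=9, pixels_per_cell=(16, 16),
                  cells_per_block=(2, 2))
    _, desc = sift.detectAndCompute(gray, None)
    # SIFT yields a variable number of 128-D descriptors; averaging them gives a
    # fixed-length vector so every image can sit in one feature matrix (assumption).
    sift_vec = desc.mean(axis=0) if desc is not None else np.zeros(128)
    return np.concatenate([hog_vec, sift_vec])

def knn_correlated(train_feats, train_labels, test_feat, k=3):
    """Classify by majority vote among the k training samples most correlated with the test vector."""
    corrs = np.array([np.corrcoef(test_feat, f)[0, 1] for f in train_feats])
    nearest = np.argsort(-corrs)[:k]            # highest correlation = nearest neighbor
    labels, counts = np.unique(np.asarray(train_labels)[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Usage: build the training matrix once per category (single- or double-handed),
# then classify a test image against that matrix.
# train_feats = np.stack([extract_features(img) for img in train_imgs])
# predicted = knn_correlated(train_feats, train_labels, extract_features(test_img))
```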
