Android-Based American Sign Language Recognition System with Skin Segmentation and SVM

People with hearing impairment use sign language for communication. They use hand gestures to represent numbers, letters, words, and sentences, which allows them to communicate among themselves. The problem arises when they need to interact with other people. An automated system that converts sign language to text would make this interaction easier. Many sign language recognition systems have been developed recently, but most of them run on laptops or desktop computers, which are impractical to carry because of their weight and size. This article presents the design and implementation of an Android application that converts American Sign Language to text, so that it can be used anywhere and at any time. An image is captured by the smartphone camera, and skin segmentation is performed in the YCbCr color space. Features are extracted from the segmented image using the Histogram of Oriented Gradients (HOG) and classified with a Support Vector Machine (SVM) to recognize the sign.
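The pipeline described above (YCbCr skin segmentation, HOG feature extraction, SVM classification) can be sketched in a few lines. The snippet below is a minimal desktop prototype using OpenCV and scikit-learn, not the paper's implementation: the Cb/Cr thresholds are common values from the skin-detection literature, and the HOG window parameters and linear kernel are assumptions, since the abstract does not specify them. On Android, the same OpenCV operations are available through its Java/Kotlin bindings.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

# Cb/Cr skin thresholds commonly used in the literature; the paper's exact
# bounds are not given in the abstract, so these values are illustrative.
CR_MIN, CR_MAX = 133, 173
CB_MIN, CB_MAX = 77, 127

def segment_skin(bgr):
    """Keep only skin-colored pixels by thresholding in YCbCr space."""
    ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)  # channel order: Y, Cr, Cb
    lower = np.array([0, CR_MIN, CB_MIN], dtype=np.uint8)
    upper = np.array([255, CR_MAX, CB_MAX], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    return cv2.bitwise_and(bgr, bgr, mask=mask)

def hog_features(bgr, size=(64, 64)):
    """Compute a HOG descriptor over the resized grayscale hand region."""
    gray = cv2.resize(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), size)
    # winSize, blockSize, blockStride, cellSize, nbins (assumed parameters)
    hog = cv2.HOGDescriptor(size, (16, 16), (8, 8), (8, 8), 9)
    return hog.compute(gray).ravel()

def train_classifier(images, labels):
    """Fit an SVM on HOG vectors from skin-segmented training images."""
    X = np.array([hog_features(segment_skin(img)) for img in images])
    clf = SVC(kernel="linear")  # kernel choice is an assumption
    clf.fit(X, labels)
    return clf

def predict_sign(clf, bgr):
    """Recognize the sign shown in a single camera frame."""
    return clf.predict([hog_features(segment_skin(bgr))])[0]
```

In a deployed app, `predict_sign` would be called on each frame delivered by the camera preview, with the trained SVM bundled as a model file on the device.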
