Improvements in a Wearable Device for Sign Language Translation

A commercial product for sign language translation is still not available. This paper presents our latest results towards this goal through a functional prototype called Talking Hands. Talking Hands uses a data glove to detect the hand movements of the user and a smartphone application that gathers the sensor data and translates them into speech via a speech synthesizer. Talking Hands adopts the solutions best suited to mass production without penalizing reliability. This paper presents the improvements of the latest prototype in terms of hardware, software, and design, together with a preliminary analysis of the translation of dynamic gestures with this device.
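
To make the described pipeline concrete, the following is a minimal sketch of a glove-to-speech loop of the kind the abstract outlines: glove readings are matched against stored gesture templates and the recognized word is handed to a speech synthesizer. All names, sensor counts, template values, and the nearest-template classifier are illustrative assumptions, not the authors' implementation.

```python
import math

# Hypothetical gesture dictionary: each entry is a normalized reference
# vector (e.g., five flex sensors plus two IMU orientation angles).
GESTURE_TEMPLATES = {
    "hello":  [0.1, 0.9, 0.9, 0.9, 0.9, 0.0, 0.5],
    "thanks": [0.8, 0.2, 0.2, 0.2, 0.2, 0.3, 0.1],
}

def euclidean(a, b):
    """Distance between a live glove reading and a stored template."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(reading, threshold=0.4):
    """Return the closest gesture label, or None if nothing is near enough."""
    label, dist = min(
        ((name, euclidean(reading, tpl))
         for name, tpl in GESTURE_TEMPLATES.items()),
        key=lambda pair: pair[1],
    )
    return label if dist < threshold else None

def speak(text):
    # Stand-in for the smartphone's speech synthesizer (e.g., a TTS API).
    print(f"[TTS] {text}")

# Example: one normalized frame streamed from the glove to the phone.
frame = [0.12, 0.88, 0.92, 0.87, 0.91, 0.05, 0.48]
word = classify(frame)
if word:
    speak(word)
```

A static-posture matcher like this would not cover the dynamic gestures mentioned in the abstract, which require comparing sequences of frames over time rather than single readings.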
