Wireless Wearable for Sign Language Translator Device using Intel UP Squared (UP2) Board

Sign language translator devices translate hand gestures into text or voice, allowing interactive communication between deaf and hearing people without reliance on human interpreters. The main focus of this work is the development of a wireless wearable sign language translator device using an Intel UP Squared (UP2) board. The developed device consists of a glove-based wearable and a display device based on an Intel UP2 board. When a user forms a hand gesture, the accelerometer and flex sensors in the wearable measure the gesture and convey the data to an Arduino Nano microcontroller. The microcontroller translates the gesture into text and transmits it wirelessly to the UP2 board, which subsequently displays the text on an LCD. In this article, the developed hardware, circuit diagrams, and preliminary experimental results are presented, showing the performance of the device and demonstrating how the Intel UP2 board can be connected wirelessly to a low-cost Arduino microcontroller via Bluetooth communication.
