On the Use of Multi-Modal Sensing in Sign Language Classification

In the literature, sign language recognition (SLR) has been addressed using multi-channel data acquisition devices with various sensing modalities. With wearable sensors, multimodal data acquisition has been shown to be particularly useful for improving classification accuracy compared to single-modality acquisition. In this work, a statistical analysis is presented to quantify the performance of different combinations of wearable sensors, namely surface electromyogram (sEMG) sensors, accelerometers and gyroscopes, in the classification of isolated signs. Twelve signs from Indian sign language are considered, comprising static hand postures, signs with simple wrist motion, and signs with complex forearm motion. The following four combinations of sensor modalities are compared for classification accuracy using statistical tests: 1) accelerometer and gyroscope, 2) sEMG and accelerometer, 3) sEMG and gyroscope, and 4) sEMG, accelerometer and gyroscope. Results obtained on recorded data indicate that the combination of all three modalities, namely sEMG, accelerometer and gyroscope, yields the best classification accuracy of 88.25% compared to the remaining sensor combinations. However, statistical analysis of the classification accuracies using analysis of variance (ANOVA) indicates that sEMG sensors are particularly useful in the classification of static hand postures. Moreover, signs involving dynamic hand motion, whether simple wrist motion or motion of the hand along a complex trajectory, are classified comparatively better with any sensing modality than static hand postures.
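
To make the comparison concrete, the sketch below illustrates the kind of one-way ANOVA described above, applied to per-fold classification accuracies of the four sensor combinations. This is a minimal illustration only: the fold structure, variable names, and accuracy values are placeholder assumptions, not the study's data; only scipy.stats.f_oneway is a real library call.

```python
# Minimal sketch (not the authors' code or data): one-way ANOVA comparing
# classification accuracies across the four sensor-modality combinations.
import numpy as np
from scipy.stats import f_oneway

# Hypothetical per-fold accuracies (%) for each sensor combination.
accel_gyro      = np.array([81.2, 79.8, 82.5, 80.1, 81.9])  # accelerometer + gyroscope
semg_accel      = np.array([83.4, 84.1, 82.7, 83.9, 84.6])  # sEMG + accelerometer
semg_gyro       = np.array([82.0, 83.2, 81.5, 82.8, 83.0])  # sEMG + gyroscope
semg_accel_gyro = np.array([87.9, 88.6, 88.0, 88.4, 87.5])  # sEMG + accelerometer + gyroscope

# One-way ANOVA: does mean accuracy differ across the sensor combinations?
f_stat, p_value = f_oneway(accel_gyro, semg_accel, semg_gyro, semg_accel_gyro)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) would indicate that at least one combination's
# mean accuracy differs significantly from the others.
```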
