Evaluating a biosensor-based interface to recognize hand-finger gestures using a Myo armband

Gesture recognition is a convenient and natural Human-Computer Interaction (HCI) technique. Recent advances in bioengineering have brought biosensor technologies into HCI, since biosensors provide real-time feedback from biological activity. This has enabled User Interface (UI) designers to build more natural UIs, including Muscle-Computer Interfaces (MCIs). This paper presents an evaluation of an MCI designed to recognize four-finger pinching, fist, and five-finger spread gestures from Electromyography (EMG) signals captured by a Myo armband from Thalmic Labs Inc. An experimental research strategy was used: a Feedforward Neural Network was implemented to classify the gestures, and each gesture was trained for 3 seconds. Across 6 participants, an average success rate of 95% was achieved for completing gesture-posing tasks, with an average prediction error of 14.48, expressed as the Root Mean Square Error (RMSE). The results illustrate that biosensor-based gesture recognition is a modern and reliable approach that benefits HCI: compared with vision- and sensor-based approaches, it provided greater accessibility and encumbered users less. The Myo armband proved to be an economical, off-the-shelf bio-sensing wearable device that can be used successfully for hand and finger gesture recognition.
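
To make the described pipeline concrete, the sketch below illustrates the kind of classifier the abstract outlines: windowed 8-channel Myo EMG is reduced to per-channel features and fed to a small feedforward network, with RMSE computed on the network outputs. This is a minimal illustration under stated assumptions; the window length, the RMS feature, the layer sizes, and the training scheme are illustrative placeholders, not the paper's exact configuration.

```python
# Minimal sketch (assumptions noted inline): 8-channel EMG windows ->
# per-channel RMS features -> one-hidden-layer feedforward classifier,
# with RMSE reported on the network outputs, as in the paper's evaluation.
import numpy as np

N_CHANNELS = 8   # the Myo armband streams 8 surface-EMG channels at 200 Hz
WINDOW = 40      # 200 ms analysis window at 200 Hz (assumed, not from the paper)
N_GESTURES = 3   # pinch, fist, finger spread

def rms_features(window: np.ndarray) -> np.ndarray:
    """Per-channel root-mean-square amplitude for one (WINDOW, N_CHANNELS) array."""
    return np.sqrt(np.mean(window.astype(float) ** 2, axis=0))

class FeedforwardNet:
    """One-hidden-layer feedforward classifier trained with plain gradient descent."""

    def __init__(self, n_in=N_CHANNELS, n_hidden=16, n_out=N_GESTURES, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def forward(self, X):
        self.h = np.tanh(X @ self.W1 + self.b1)       # hidden activations
        logits = self.h @ self.W2 + self.b2
        e = np.exp(logits - logits.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)       # softmax over gestures

    def train(self, X, Y, lr=0.1, epochs=500):
        for _ in range(epochs):
            P = self.forward(X)
            d_out = (P - Y) / len(X)                  # softmax + cross-entropy gradient
            d_h = d_out @ self.W2.T * (1.0 - self.h ** 2)
            self.W2 -= lr * self.h.T @ d_out
            self.b2 -= lr * d_out.sum(axis=0)
            self.W1 -= lr * X.T @ d_h
            self.b1 -= lr * d_h.sum(axis=0)

def rmse(P, Y):
    """Root Mean Square Error between network outputs and one-hot targets."""
    return float(np.sqrt(np.mean((P - Y) ** 2)))

# Toy usage with synthetic windows; real input would come from the Myo SDK.
rng = np.random.default_rng(1)
scales = (np.arange(N_GESTURES) + 1.0)[:, None, None, None]
raw = rng.normal(0.0, scales, (N_GESTURES, 200, WINDOW, N_CHANNELS))
X = np.array([rms_features(w) for gesture in raw for w in gesture])
Y = np.eye(N_GESTURES).repeat(200, axis=0)
net = FeedforwardNet()
net.train(X, Y)
print("training RMSE:", rmse(net.forward(X), Y))
```

Note that the paper reports RMSE as its prediction-error measure even though the task is classification; the sketch mirrors that by computing RMSE between the softmax outputs and the one-hot gesture labels.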
