Motion quaternion-based motion estimation method of MYO using K-means algorithm and Bayesian probability

Abstract: There are diverse types of devices based on natural user interfaces/experiences for humanized computing. One such device, the MYO armband, measures arm motions and uses them as a gesture-based interface. Several studies have measured arm motions using MYOs. For example, one study defines two types of motions, one for the forearm and one for the upper arm, whose orientations are measured by two MYOs. Bayesian probabilities are then calculated from the measured orientations and used to estimate the orientation of the upper arm when it is not being measured. However, because the orientation of a MYO can be expressed as a single quaternion, a Bayesian probability computed over whole quaternions is more accurate than one computed over the individual elements of the quaternions. This paper proposes a motion estimation method that increases the accuracy of motion estimation: the orientations obtained from the MYO are expressed as single quaternions and clustered by the K-means algorithm. In the experiments, the performance of the proposed method was validated by analyzing the difference between the estimated and the measured motion quaternions, which showed enhanced performance.
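The abstract describes the pipeline only at a high level, so the following Python sketch is an assumption-laden illustration rather than the authors' implementation: paired forearm/upper-arm quaternions (treated as unit 4-vectors) are clustered with K-means, a conditional probability table P(upper-arm cluster | forearm cluster) is estimated from the paired cluster labels, and the unmeasured upper-arm orientation is predicted as the centroid of the most probable upper-arm cluster. The function names (train, estimate_upper_arm), the cluster count k, the Laplace smoothing, and the synthetic data are all hypothetical choices, not details taken from the paper.

# Minimal sketch, assuming quaternions are stored as unit 4-vectors [w, x, y, z]
# and that the paper's exact clustering and probability model may differ.
import numpy as np
from sklearn.cluster import KMeans

def train(forearm_q, upper_q, k=8, seed=0):
    """Cluster paired forearm/upper-arm quaternions and build a
    conditional probability table P(upper cluster | forearm cluster)."""
    km_f = KMeans(n_clusters=k, random_state=seed, n_init=10).fit(forearm_q)
    km_u = KMeans(n_clusters=k, random_state=seed, n_init=10).fit(upper_q)
    # Count joint cluster occurrences over the paired training samples.
    counts = np.zeros((k, k))
    for cf, cu in zip(km_f.labels_, km_u.labels_):
        counts[cf, cu] += 1
    # Row-normalize with Laplace smoothing to obtain conditional probabilities.
    cond = (counts + 1.0) / (counts + 1.0).sum(axis=1, keepdims=True)
    return km_f, km_u, cond

def estimate_upper_arm(q_forearm, km_f, km_u, cond):
    """Estimate the unmeasured upper-arm orientation from a forearm quaternion."""
    cf = km_f.predict(q_forearm.reshape(1, -1))[0]
    cu = np.argmax(cond[cf])        # most probable upper-arm cluster
    q = km_u.cluster_centers_[cu]
    return q / np.linalg.norm(q)    # renormalize the centroid to a unit quaternion

# Example with synthetic unit quaternions (illustrative only).
rng = np.random.default_rng(0)
f = rng.normal(size=(500, 4)); f /= np.linalg.norm(f, axis=1, keepdims=True)
u = rng.normal(size=(500, 4)); u /= np.linalg.norm(u, axis=1, keepdims=True)
km_f, km_u, cond = train(f, u)
print(estimate_upper_arm(f[0], km_f, km_u, cond))

Note that plain K-means with Euclidean distance ignores the fact that q and -q represent the same rotation; a practical implementation would canonicalize the sign of each quaternion (or use a rotation-aware distance) before clustering.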
