Estimation of Grasp States in Prosthetic Hands using Deep Learning

The estimation of grasp states in myoelectric prosthetic hands is relevant to ergonomic interfacing, control, and rehabilitation initiatives. In this paper we evaluate the feasibility of inferring the grasp state of a prosthetic hand from RGB frames using well-known deep learning architectures, under testing scenarios involving variations in brightness, contrast, and horizontal flips. Our results show that a GoogLeNet-based deep architecture can estimate prosthetic hand poses with attractive accuracy and efficiency using relatively few training frames.
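The testing scenarios above perturb each RGB frame by brightness, contrast, and flipping before it is fed to the classifier. A minimal sketch of such perturbations is given below; the function name and parameter conventions (frames as HxWx3 arrays in [0, 1], contrast scaled about mid-gray) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def augment_frame(frame, brightness=0.0, contrast=1.0, hflip=False):
    """Apply brightness/contrast/flip variations to one RGB frame.

    `frame` is an HxWx3 array with values in [0, 1]. The parameter
    names and the mid-gray contrast convention are illustrative
    assumptions, not the paper's exact preprocessing.
    """
    out = frame.astype(np.float64)
    out = (out - 0.5) * contrast + 0.5   # scale contrast about mid-gray
    out = out + brightness               # additive brightness shift
    if hflip:
        out = out[:, ::-1, :]            # horizontal flip (mirror columns)
    return np.clip(out, 0.0, 1.0)        # keep values in the valid range
```

In a typical evaluation, the classifier is trained on unperturbed frames and then scored on copies of the test set produced by sweeping these parameters, which measures robustness to lighting and viewpoint changes.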
