Video-based Prediction of Hand-grasp Preshaping with Application to Prosthesis Control

Among the currently available grasp-type selection techniques for hand prostheses, there is a distinct lack of intuitive, robust, low-latency solutions. In this paper we investigate the use of a portable, forearm-mounted, video-based technique for predicting hand-grasp preshaping for arbitrary objects. The purpose of this system is to automatically select the grasp-type for the user of the prosthesis, potentially increasing ease of use and functionality. The system can supplement and improve existing control strategies for prosthetic and orthotic devices, such as surface electromyography (sEMG) pattern recognition. We designed and collected a suitable dataset consisting of RGB-D video data for 2212 grasp examples split evenly across 7 classes: 6 grasps commonly used in activities of daily living, and an additional no-grasp category. We processed and analyzed the dataset using several state-of-the-art deep learning architectures. Our selected model shows promising results for realistic, intuitive, real-world use, reaching per-frame accuracies on video sequences of up to 95.90% on the validation set. Such a system could be integrated into the palm of a hand prosthesis, allowing automatic prediction of the grasp-type without requiring any special movements or aiming by the user.
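The per-frame accuracy reported above implies that each video frame is classified independently and compared against the sequence's grasp label, with the frame-level decisions then aggregatable into one sequence-level grasp-type selection. A minimal sketch of that evaluation scheme, with hypothetical per-frame class scores standing in for the network output (the class names and the 5-frame example are illustrative assumptions, not taken from the paper):

```python
from collections import Counter

# 7 classes: 6 common ADL grasps plus a no-grasp category.
# These exact grasp names are illustrative assumptions.
CLASSES = ["power", "precision", "lateral", "tripod", "hook", "spherical", "no-grasp"]

def per_frame_predictions(frame_scores):
    """Argmax over each frame's class-score vector."""
    return [max(range(len(s)), key=s.__getitem__) for s in frame_scores]

def per_frame_accuracy(preds, true_label):
    """Fraction of frames whose prediction matches the sequence label."""
    return sum(p == true_label for p in preds) / len(preds)

def majority_vote(preds):
    """Aggregate per-frame predictions into one sequence-level decision."""
    return Counter(preds).most_common(1)[0][0]

# Hypothetical scores for a 5-frame sequence labelled "power" (class 0):
scores = [
    [0.90, 0.02, 0.02, 0.02, 0.02, 0.01, 0.01],
    [0.70, 0.10, 0.05, 0.05, 0.05, 0.03, 0.02],
    [0.20, 0.50, 0.10, 0.10, 0.05, 0.03, 0.02],  # one misclassified frame
    [0.80, 0.05, 0.05, 0.04, 0.03, 0.02, 0.01],
    [0.85, 0.05, 0.03, 0.03, 0.02, 0.01, 0.01],
]
preds = per_frame_predictions(scores)
print(per_frame_accuracy(preds, 0))   # 4 of 5 frames correct -> 0.8
print(CLASSES[majority_vote(preds)])  # power
```

Aggregating frame-level votes in this way is one simple route to the low-latency, robust grasp selection the abstract targets: a single misclassified frame does not flip the prosthesis's chosen grasp.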
