Convolutional Neural Networks for Environmentally Aware Locomotion Mode Recognition of Lower-Limb Amputees

Powered lower-limb prostheses feature a high-level intelligent control system, referred to as locomotion mode recognition (LMR), which enables seamless amputee-prosthesis interaction by activating the appropriate low-level controllers according to the user’s gait intent and environment. Environmental and terrain conditions provide valuable subject-independent prior information about the upcoming locomotion modes, which enables the design of seamless, non-delayed LMR systems. The objective of this paper is to validate the feasibility of deep convolutional neural networks (CNNs) for distinguishing three environmental conditions: level walking, stair ascent, and stair descent. The CNN automates the feature learning and extraction that were hand-engineered in traditional models. We construct an efficient CNN through transfer learning from a pre-trained model; the input images are captured from seven able-bodied subjects during various indoor and outdoor daily-life walking tasks. A standstill detection algorithm based on an inertial measurement unit (IMU) is developed to automate image capture. To further enhance prediction performance, we incorporate the history of previously predicted environmental conditions and categorical information about environment properties (e.g., the number of steps in a staircase). The proposed environment recognition system achieves an overall accuracy of about 99% on the test data. The results verify the potential of CNNs to accurately predict environmental conditions, which can be used individually or in combination with other sensors to design an accurate and robust LMR system for lower-limb amputees with powered prostheses.
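
For illustration, the following minimal sketch shows how a transfer-learning classifier of this kind could be assembled in Keras. The MobileNetV2 backbone, the 224x224 input size, and the training settings are assumptions made for the example only; the abstract does not name the pre-trained model or its hyperparameters.

# Hedged sketch of a three-class environment classifier built by transfer
# learning from a pre-trained CNN. Backbone and settings are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # level walking, stair ascent, stair descent

# Pre-trained convolutional base with frozen ImageNet weights.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # reuse the learned visual features as-is

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=10, validation_split=0.2)

Only the small classification head is trained here; fine-tuning some of the backbone layers afterwards is a common variant of the same idea.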
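The IMU-based standstill trigger for image capture could likewise be sketched as a simple sliding-window variance test on the accelerometer signal. The sampling rate, window length, and threshold below are illustrative assumptions, not values reported in the paper, and the camera/streaming interfaces are hypothetical.

# Hedged sketch of an IMU-based standstill detector that gates image capture.
import numpy as np

FS = 100               # assumed IMU sampling rate (Hz)
WINDOW = FS // 2       # 0.5 s sliding window (assumption)
ACC_VAR_THRESH = 0.05  # (m/s^2)^2, tuned per sensor in practice

def is_standstill(acc_xyz: np.ndarray) -> bool:
    """acc_xyz: (WINDOW, 3) array of recent accelerometer samples."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    return np.var(magnitude) < ACC_VAR_THRESH

# Example loop (hypothetical interfaces): capture and classify an
# environment image only when the user is standing still.
# for window in stream_imu_windows():
#     if is_standstill(window):
#         image = camera.capture()
#         mode = predict_environment(image)  # CNN from the sketch above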
