Directional PointNet: 3D Environmental Classification for Wearable Robotics

Environmental information can provide reliable prior knowledge about human motion intent, helping subjects who wear robotic devices to walk in complex environments. Previous researchers have used 1D signals and 2D images to classify environments, but these modalities can suffer from self-occlusion. In comparison, 3D point clouds depict environments more completely, so we propose a directional PointNet that classifies 3D point clouds directly. By utilizing the orientation information of the point cloud, the directional PointNet can classify daily terrains, including level ground, up stairs, and down stairs, achieving a classification accuracy of 99% on the test set. Moreover, the directional PointNet is more efficient than the original PointNet because the T-net, which estimates the transformation of the point cloud, is removed in this work and the length of the global feature is optimized. The experimental results demonstrate that the directional PointNet classifies environments robustly and efficiently.
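
As a rough illustration of the architecture the abstract describes, the following is a minimal sketch (assuming PyTorch) of a PointNet-style classifier with the T-net alignment modules removed. The layer widths, the global-feature length (512 here), the class ordering, and the `DirectionalPointNet` name are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a PointNet-style terrain classifier without T-net.
# Hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn

class DirectionalPointNet(nn.Module):
    def __init__(self, num_classes=3, global_feat_len=512):
        super().__init__()
        # Shared per-point MLP, implemented as 1x1 convolutions over (B, 3, N)
        self.point_mlp = nn.Sequential(
            nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, global_feat_len, 1),
            nn.BatchNorm1d(global_feat_len), nn.ReLU(),
        )
        # Classifier head over the max-pooled global feature
        self.classifier = nn.Sequential(
            nn.Linear(global_feat_len, 256), nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(256, num_classes),  # level ground / up stairs / down stairs
        )

    def forward(self, points):
        # points: (batch, num_points, 3); no learned alignment (T-net) is
        # applied, since the cloud's orientation is assumed to be known
        x = points.transpose(1, 2)        # -> (batch, 3, num_points)
        x = self.point_mlp(x)             # per-point features
        x = torch.max(x, dim=2).values    # symmetric max pool -> global feature
        return self.classifier(x)         # class logits

# Usage: classify a batch of 8 clouds of 1024 points each
model = DirectionalPointNet()
logits = model(torch.randn(8, 1024, 3))
pred = logits.argmax(dim=1)  # 0: level ground, 1: up stairs, 2: down stairs
```

Dropping the T-net is plausible here because, as the abstract notes, the orientation information of the point cloud is already available, so a learned spatial alignment stage becomes unnecessary and the network is correspondingly cheaper to run.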
