Human Motion Capture Based on Kinect and IMUs and Its Application to Human-Robot Collaboration

In this paper, a nonlinear-optimization-based solver is proposed to build a human upper-body motion capture system that estimates the orientation of each bone joint and the global position of the hip. Five IMUs and one Kinect are used: the IMU rotation and acceleration data, the Kinect skeleton position data, and a human pose prior are fused in a nonlinear optimization framework. The proposed method combines the high accuracy of Kinect data with the stability of IMU data, compensating for the loss of pose-capture accuracy under dynamic motion, magnetic disturbance, and occlusion. To verify the practicability of the proposed system, it is first applied to real-time human motion capture; a simple human-robot collaboration (HRC) case, common in modern collaborative medical settings, is then implemented. The experimental results show that the proposed system achieves stable human motion measurement and meets the needs of HRC scenarios.
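
As a rough illustration of the kind of fusion described above, the sketch below sets up a single nonlinear least-squares problem whose residuals combine Kinect joint-position measurements, IMU bone orientations, and a pose prior. This is a minimal, assumption-laden toy rather than the authors' solver: the kinematic chain, bone lengths, residual weights, and the use of SciPy's `least_squares` are all illustrative choices introduced here.

```python
# Minimal sketch (not the paper's implementation) of fusing Kinect joint
# positions, IMU orientations, and a pose prior in one nonlinear
# least-squares problem. The chain, offsets, and weights are assumptions.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

# Toy upper-body chain: hip -> spine -> chest -> upper arm -> forearm.
BONE_OFFSETS = np.array([      # bone vectors in each parent frame (metres)
    [0.0, 0.25, 0.0],          # hip   -> spine
    [0.0, 0.25, 0.0],          # spine -> chest
    [0.2, 0.0, 0.0],           # chest -> upper arm (right shoulder)
    [0.25, 0.0, 0.0],          # upper arm -> forearm (elbow)
])
N_JOINTS = len(BONE_OFFSETS)

def forward_kinematics(hip_pos, joint_rotvecs):
    """Return global joint positions and bone orientations for the chain."""
    positions, orientations = [hip_pos], []
    R_acc = R.identity()
    p = hip_pos
    for i in range(N_JOINTS):
        R_acc = R_acc * R.from_rotvec(joint_rotvecs[i])
        p = p + R_acc.apply(BONE_OFFSETS[i])
        positions.append(p)
        orientations.append(R_acc)
    return np.array(positions), orientations

def residuals(x, kinect_positions, imu_rotations, w_kinect, w_imu, w_prior):
    hip_pos = x[:3]
    joint_rotvecs = x[3:].reshape(N_JOINTS, 3)
    positions, orientations = forward_kinematics(hip_pos, joint_rotvecs)

    # 1) Kinect term: estimated joint positions should match the skeleton.
    r_kinect = w_kinect * (positions - kinect_positions).ravel()

    # 2) IMU term: estimated bone orientations should match IMU readings
    #    (residual = rotation vector of the relative rotation).
    r_imu = np.concatenate([
        w_imu * (orientations[i].inv() * imu_rotations[i]).as_rotvec()
        for i in range(N_JOINTS)
    ])

    # 3) Pose prior: small regularizer pulling joints toward the rest pose.
    r_prior = w_prior * joint_rotvecs.ravel()
    return np.concatenate([r_kinect, r_imu, r_prior])

if __name__ == "__main__":
    # Synthetic "measurements" standing in for one frame of sensor data.
    rng = np.random.default_rng(0)
    true_rotvecs = 0.2 * rng.standard_normal((N_JOINTS, 3))
    true_hip = np.array([0.1, 0.9, 0.0])
    kinect_positions, true_orients = forward_kinematics(true_hip, true_rotvecs)
    imu_rotations = true_orients

    x0 = np.zeros(3 + 3 * N_JOINTS)            # rest pose, hip at origin
    sol = least_squares(
        residuals, x0,
        args=(kinect_positions, imu_rotations, 1.0, 1.0, 0.1),
    )
    print("estimated hip position:", sol.x[:3])
```

In a real system of this kind, the Kinect term would drop out or be down-weighted when joints are occluded, while the IMU and prior terms keep the solve well conditioned, which is the complementarity the abstract refers to.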
