Research into Kinect/Inertial Measurement Units Based on Indoor Robots

Because indoor mobile navigation suffers from low positioning accuracy and accumulated error, we carried out research into an integrated localization system for a robot based on the Kinect and an Inertial Measurement Unit (IMU). In this paper, close-range stereo images are used to compute the attitude and translation between adjacent robot positions by means of an absolute orientation algorithm, improving the accuracy of the estimated robot motion. Combining the Kinect visual measurements with the strap-down IMU, we also use Kalman filtering on the position and attitude errors of the two outputs to obtain an optimal estimate and correct the errors. Experimental results show that the proposed method improves the positioning accuracy and stability of an indoor mobile robot.
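The absolute orientation step mentioned above recovers the rotation and translation between two matched 3-D point sets. The paper does not give its exact formulation, but a standard SVD-based (Kabsch/Horn-style) solution can be sketched as follows; the function name and shapes are illustrative assumptions, not the authors' code:

```python
import numpy as np

def absolute_orientation(P, Q):
    """Estimate R, t such that Q ~ R @ P + t for matched 3-D point sets
    P, Q of shape (3, N). Illustrative SVD (Kabsch-style) solution, not
    the paper's exact algorithm."""
    p_mean = P.mean(axis=1, keepdims=True)
    q_mean = Q.mean(axis=1, keepdims=True)
    # 3x3 cross-covariance of the centered point sets
    H = (P - p_mean) @ (Q - q_mean).T
    U, _, Vt = np.linalg.svd(H)
    # correct a possible reflection so R is a proper rotation (det = +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

Given matched depth points from two adjacent Kinect frames, the returned rotation and translation correspond to the inter-frame attitude change and displacement used for motion estimation.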
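The Kalman-filter fusion of the IMU and Kinect outputs can likewise be illustrated with a minimal loosely coupled sketch: IMU acceleration drives the prediction and a Kinect-derived position fix serves as the measurement. This 1-D position/velocity model, along with the noise parameters `q` and `r`, is a simplifying assumption for illustration, not the paper's actual filter design:

```python
import numpy as np

def kf_fuse(x, P, accel, z_vis, dt, q=1e-3, r=1e-2):
    """One predict/update cycle of a loosely coupled Kalman filter.
    State x = [position, velocity]; accel is the IMU input, z_vis the
    Kinect position measurement. Illustrative 1-D sketch."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # we measure position only
    Q = q * np.eye(2)
    R = np.array([[r]])
    # predict with the IMU acceleration
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # update with the Kinect position fix
    y = z_vis - H @ x                       # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

In the paper's setting the filter operates on the position and attitude *errors* of the two subsystems rather than on raw states, but the predict/update structure is the same.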