Real-Time Indoor Dense 3D Reconstruction Integrating Inertial Measurements Into Surfels

With continuing advances in sensor technology and SLAM methods, visual SLAM systems can now achieve dense reconstruction of the surrounding environment. However, when the optimization is poorly initialized, purely visual SLAM lacks robustness in the tracking step. An Inertial Measurement Unit (IMU) measures the angular velocity and acceleration of the sensor body at high rates. Motivated by this capability, we propose a method that tightly couples IMU data with camera data while maintaining real-time performance on the GPU. Our method builds on the visual method ElasticFusion. To enforce global consistency, the sensor velocity, pose, and IMU biases are jointly optimized, and inertial information is incorporated into the map, yielding a fully dense surfel-based 3D reconstruction of the environment. Extensive experiments show that our visual-inertial SLAM method significantly outperforms pure-vision methods in real-time reconstruction, loop-closure capability, and map drift correction.
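To make the IMU's role concrete, the sketch below shows one Euler step of IMU state propagation: bias-corrected angular velocity and acceleration are integrated to update orientation, velocity, and position. This is an illustrative simplification of standard visual-inertial kinematics, not the paper's actual implementation; the function names (`propagate_imu`, `exp_so3`) and the simple Euler discretization are our own assumptions.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix so that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(phi):
    """Rodrigues' formula: rotation matrix from a rotation vector phi."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3) + skew(phi)  # first-order approximation near zero
    a = phi / theta
    K = skew(a)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def propagate_imu(R, v, p, gyro, accel, bg, ba, g, dt):
    """One Euler integration step of the IMU state in the world frame.

    R: 3x3 body-to-world rotation; v, p: world-frame velocity and position.
    gyro, accel: raw body-frame IMU samples; bg, ba: gyro/accel biases
    (these are among the states jointly optimized in the paper's method);
    g: gravity vector in the world frame; dt: sample interval in seconds.
    """
    w = gyro - bg            # bias-corrected angular velocity (body frame)
    a = accel - ba           # bias-corrected acceleration (body frame)
    a_world = R @ a + g      # rotate into the world frame, add gravity
    p_new = p + v * dt + 0.5 * a_world * dt * dt
    v_new = v + a_world * dt
    R_new = R @ exp_so3(w * dt)
    return R_new, v_new, p_new
```

In a tightly-coupled system, states propagated this way between camera frames supply the motion prior for tracking, and the residual between the propagated and visually estimated pose constrains the joint optimization of pose, velocity, and biases.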
