Dense visual-inertial odometry for tracking of aggressive motions

We propose a sliding-window-based dense visual-inertial fusion method for real-time tracking of challenging aggressive motions. Our method combines recent advances in direct dense visual odometry, inertial measurement unit (IMU) preintegration, and graph-based optimization. At the front-end, direct dense visual odometry provides camera pose tracking that is resistant to motion blur. At the back-end, a sliding-window optimization-based fusion framework with efficient IMU preintegration generates smooth and high-accuracy state estimates, even under occasional visual tracking failures. A local loop closure integrated into the back-end further eliminates drift after extremely aggressive motions. Our system runs in real time at 25 Hz on an off-the-shelf laptop. Experimental results show that our method accurately tracks motions with angular velocities of up to 1000 degrees/s and linear velocities of up to 4 m/s. We also compare our method with state-of-the-art systems, such as Google Tango, and show superior performance during challenging motions. Our method achieves reliable tracking even when the sensor suite is thrown during the experiments.
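The back-end relies on IMU preintegration, which summarizes the raw gyroscope and accelerometer samples between two keyframes into relative rotation, velocity, and position terms that do not need to be re-integrated each time the sliding-window optimizer re-linearizes the poses. The sketch below illustrates this idea in Python under simplifying assumptions (constant sample interval, no bias correction, no noise propagation); all names are illustrative, not the authors' implementation.

```python
# Minimal sketch of IMU preintegration between two keyframes.
# Assumptions (not from the paper): constant sample interval dt,
# bias-free measurements, no covariance propagation.
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix such that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Rodrigues' formula: map a rotation vector to a rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-8:
        return np.eye(3) + skew(w)  # first-order approximation
    K = skew(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(imu_samples, dt):
    """Accumulate relative rotation, velocity, and position deltas from raw
    (gyro, accel) sample pairs, expressed in the first keyframe's frame.
    Gravity is deliberately excluded; it is re-applied in the optimizer."""
    dR = np.eye(3)    # relative rotation
    dv = np.zeros(3)  # relative velocity delta
    dp = np.zeros(3)  # relative position delta
    for gyro, accel in imu_samples:
        dp += dv * dt + 0.5 * (dR @ accel) * dt**2
        dv += (dR @ accel) * dt
        dR = dR @ exp_so3(gyro * dt)
    return dR, dv, dp
```

Because the deltas are expressed relative to the first keyframe, the optimizer can re-linearize the keyframe states freely while reusing the preintegrated terms, which is what keeps the sliding-window fusion efficient at high IMU rates.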
