OpenVINS: A Research Platform for Visual-Inertial Estimation

In this paper, we present an open platform, termed OpenVINS, for visual-inertial estimation research for both the academic community and practitioners from industry. The open-sourced codebase provides a foundation for researchers and engineers to quickly start developing new capabilities for their visual-inertial systems. The codebase offers out-of-the-box support for commonly desired visual-inertial estimation features, including: (i) an on-manifold sliding-window Kalman filter, (ii) online camera intrinsic and extrinsic calibration, (iii) camera-to-inertial-sensor time offset calibration, (iv) SLAM landmarks with different representations and consistent First-Estimates Jacobian (FEJ) treatment, (v) a modular type system for state management, (vi) an extendable visual-inertial system simulator, and (vii) an extensive toolbox for algorithm evaluation. Moreover, we have focused on detailed documentation and theoretical derivations to support rapid development and research, both of which are greatly lacking in current open-sourced algorithms. Finally, we perform comprehensive validation of the proposed OpenVINS against state-of-the-art open-sourced algorithms, showing its competitive estimation performance.
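As a brief illustration of item (v), the following is a minimal sketch of how a modular type system can let each state block define its own minimal-dimension, on-manifold update so the filter can apply a correction vector without knowing whether a block is a vector-space quantity or a unit quaternion. This is a hypothetical example, not the actual OpenVINS API: the class names (`Type`, `Vec`, `Quat`, `State`) are invented here, and the quaternion update uses Eigen's Hamilton convention for simplicity rather than the JPL convention common in this literature.

```cpp
// Minimal sketch (hypothetical, not the OpenVINS API) of a modular
// "type" based state: each sub-state knows its own error-state size
// and update rule, so the EKF correction can be applied generically.
#include <Eigen/Dense>
#include <iostream>
#include <memory>
#include <vector>

// Base class: every state element exposes its error-state dimension and an update.
struct Type {
  virtual ~Type() = default;
  virtual int size() const = 0;                        // minimal (error-state) dimension
  virtual void update(const Eigen::VectorXd &dx) = 0;  // apply correction
};

// Plain vector block (e.g., position, velocity, bias): additive update.
struct Vec : Type {
  explicit Vec(int dim) : value(Eigen::VectorXd::Zero(dim)) {}
  int size() const override { return static_cast<int>(value.size()); }
  void update(const Eigen::VectorXd &dx) override { value += dx; }
  Eigen::VectorXd value;
};

// Unit-quaternion block: 3-dof multiplicative update on the manifold.
// Note: Hamilton convention via Eigen, unlike the JPL convention used in [7].
struct Quat : Type {
  Quat() : value(Eigen::Quaterniond::Identity()) {}
  int size() const override { return 3; }
  void update(const Eigen::VectorXd &dx) override {
    // Small-angle correction dq ~ [1, 0.5*dtheta], then renormalize.
    Eigen::Quaterniond dq(1.0, 0.5 * dx(0), 0.5 * dx(1), 0.5 * dx(2));
    value = (dq * value).normalized();
  }
  Eigen::Quaterniond value;
};

// The state is an ordered collection of typed blocks; the correction
// vector is sliced according to each block's error-state size.
struct State {
  void insert(std::shared_ptr<Type> t) { blocks.push_back(std::move(t)); }
  int dim() const {
    int d = 0;
    for (const auto &b : blocks) d += b->size();
    return d;
  }
  void update(const Eigen::VectorXd &dx) {
    int idx = 0;
    for (auto &b : blocks) {
      b->update(dx.segment(idx, b->size()));
      idx += b->size();
    }
  }
  std::vector<std::shared_ptr<Type>> blocks;
};

int main() {
  State state;
  state.insert(std::make_shared<Quat>());  // IMU orientation
  state.insert(std::make_shared<Vec>(3));  // IMU position
  state.insert(std::make_shared<Vec>(3));  // IMU velocity

  // Pretend an EKF update produced this correction.
  Eigen::VectorXd dx = Eigen::VectorXd::Constant(state.dim(), 0.01);
  state.update(dx);
  std::cout << "error-state dimension: " << state.dim() << std::endl;
  return 0;
}
```

The benefit of this structure is that sliding-window clones, SLAM landmarks under different representations, and calibration parameters can all be added or marginalized as typed blocks without changing the filter's core update code.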
