High altitude monocular visual-inertial state estimation: Initialization and sensor fusion

Obtaining reliable state estimates in high-altitude, GPS-denied environments, such as between high-rise buildings or inside deep canyons, is known to be challenging due to the lack of direct distance measurements. Monocular visual-inertial systems offer a way to recover metric distance through proper fusion of visual and inertial measurements. However, the nonlinear optimization problem for state estimation suffers from poor numerical conditioning, or even degenerates, because visual features are hard to observe with sufficient parallax and inertial measurements must be integrated over excessively long periods. In this paper, we propose a spline-based high-altitude initialization method for monocular visual-inertial navigation systems (VINS) that pays special attention to these numerical issues. Our formulation uses only inertial measurements that contain sufficient excitation and drops uninformative ones, such as those obtained while hovering. In addition, our method explicitly reduces the number of parameters to be estimated in order to converge earlier. Building on the initialization results, we construct a complete closed-loop system for high-altitude navigation. Extensive experiments validate our approach.
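As a rough illustration of the two ingredients named in the abstract, the sketch below evaluates position and acceleration on a uniform cubic B-spline segment (the standard continuous-time representation used in spline-based visual-inertial fusion) and applies a simple excitation gate that rejects IMU windows whose specific-force profile looks like hovering. This is a minimal sketch under our own assumptions: the function names, the variance-based heuristic, and the threshold values are hypothetical and are not taken from the paper.

```python
import numpy as np

# Uniform cubic B-spline basis in matrix form; rows hold the coefficients
# of [1, u, u^2, u^3] for the four control points active on a segment.
B_SPLINE_M = (1.0 / 6.0) * np.array([
    [ 1.0,  4.0,  1.0, 0.0],
    [-3.0,  0.0,  3.0, 0.0],
    [ 3.0, -6.0,  3.0, 0.0],
    [-1.0,  3.0, -3.0, 1.0],
])

def spline_pos_acc(ctrl_pts, u, dt):
    """Position and acceleration on one uniform cubic B-spline segment.

    ctrl_pts : (4, 3) array of consecutive position control points.
    u        : normalized time in [0, 1) within the segment.
    dt       : knot spacing in seconds (scales the time derivatives).
    """
    ctrl_pts = np.asarray(ctrl_pts, dtype=float)
    pos_basis = np.array([1.0, u, u**2, u**3]) @ B_SPLINE_M
    # Second derivative of [1, u, u^2, u^3] is [0, 0, 2, 6u]; divide by dt^2
    # to convert from the normalized parameter u to physical time.
    acc_basis = np.array([0.0, 0.0, 2.0, 6.0 * u]) @ B_SPLINE_M / dt**2
    return pos_basis @ ctrl_pts, acc_basis @ ctrl_pts

GRAVITY = 9.81            # m/s^2
EXCITATION_THRESH = 0.5   # m/s^2, hypothetical tuning value

def has_sufficient_excitation(accel_samples, thresh=EXCITATION_THRESH):
    """Heuristic gate: accept an IMU window only if the specific-force
    magnitude varies enough, i.e. the platform is not merely hovering.
    accel_samples is an (N, 3) array of accelerometer readings in m/s^2."""
    norms = np.linalg.norm(np.asarray(accel_samples, dtype=float), axis=1)
    return np.std(norms) > thresh or np.abs(norms - GRAVITY).max() > 3.0 * thresh

if __name__ == "__main__":
    # Hovering-like data: near-constant specific force, should be rejected.
    hover = 0.02 * np.random.randn(200, 3)
    hover[:, 2] += GRAVITY
    print("keep hover window:", has_sufficient_excitation(hover))

    # Evaluate a toy spline segment at the middle of a knot interval.
    ctrl = np.array([[0.0, 0.0, 10.0], [1.0, 0.0, 10.5],
                     [2.0, 0.0, 11.0], [3.0, 0.0, 11.2]])
    p, a = spline_pos_acc(ctrl, u=0.5, dt=0.1)
    print("pos:", p, "acc:", a)
```

In a full system, the spline control points, gravity direction, metric scale, and IMU biases would be estimated jointly from the gated measurements; the gate above only shows the kind of test that could decide which IMU windows enter that optimization.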
