Observability-Aware Self-Calibration of Visual and Inertial Sensors for Ego-Motion Estimation

External effects such as shocks and temperature variations affect the calibration of visual–inertial sensor systems, so these systems cannot fully rely on factory calibrations. Re-calibration from short user-collected datasets may yield poor results, since the observability of certain parameters depends strongly on the motion performed. Moreover, on resource-constrained systems (e.g., mobile phones), full-batch approaches over longer sessions quickly become prohibitively expensive. In this paper, we approach the self-calibration problem by introducing information-theoretic metrics that assess the information content of trajectory segments, allowing us to select the most informative parts of a dataset for calibration. With this approach, we can build compact calibration datasets either: 1) by selecting segments from a long session with limited exciting motion or 2) from multiple short sessions, where no single session necessarily excites all modes sufficiently. Real-world experiments in four different environments show that the proposed method achieves performance comparable to a batch calibration approach, yet at a constant computational cost that is independent of the duration of the session.
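The segment-selection idea described above can be sketched as a greedy D-optimal procedure: accumulate per-segment Fisher information about the calibration parameters and repeatedly pick the segment that most increases the log-determinant of the accumulated information, up to a fixed budget that caps computational cost. This is a minimal illustrative sketch, not the paper's actual algorithm or API; all names and the choice of log-determinant as the information metric are assumptions.

```python
# Hypothetical sketch of observability-aware segment selection.
# Assumption: each candidate trajectory segment is summarized by a
# (d x d) Fisher information matrix over the d calibration parameters.
import numpy as np

def log_det(M, eps=1e-12):
    """Numerically safe log-determinant of a PSD information matrix."""
    sign, ld = np.linalg.slogdet(M + eps * np.eye(M.shape[0]))
    return ld if sign > 0 else -np.inf

def select_segments(segment_infos, budget):
    """Greedily pick `budget` segments maximizing the log-det
    (D-optimality) of the accumulated information matrix."""
    d = segment_infos[0].shape[0]
    accumulated = np.zeros((d, d))
    chosen = []
    remaining = list(range(len(segment_infos)))
    for _ in range(min(budget, len(remaining))):
        # Evaluate the information gain of adding each remaining segment.
        gains = [log_det(accumulated + segment_infos[i]) for i in remaining]
        best = remaining[int(np.argmax(gains))]
        accumulated += segment_infos[best]
        chosen.append(best)
        remaining.remove(best)
    return chosen, accumulated
```

Because the budget is fixed, the cost of the resulting calibration problem stays constant regardless of how long the original session was, which matches the constant-complexity property claimed in the abstract.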
