Visual-inertial self-calibration on informative motion segments

Environmental conditions and external effects, such as shocks, have a significant impact on the calibration parameters of visual-inertial sensor systems; thus, long-term operation of these systems cannot fully rely on factory calibration. Since the observability of certain parameters depends strongly on the motion of the device, using short data segments at device initialization may yield poor results. When such systems are additionally subject to energy constraints, running full-batch approaches on large datasets is infeasible, and careful selection of the data becomes crucial. In this paper, we present a novel approach for resource-efficient self-calibration of visual-inertial sensor systems. This is achieved by casting the calibration as a segment-based optimization problem that can be run on a small subset of informative segments; the computational burden is therefore bounded, as only a predefined number of segments is used. We also propose an efficient information-theoretic criterion to identify such informative motion segments. In evaluations on a challenging dataset, we show that our approach significantly outperforms the state of the art in terms of computational burden while maintaining comparable accuracy.
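To illustrate the kind of information-theoretic segment selection described above, the following is a minimal Python sketch, not the paper's actual method: it assumes each motion segment is summarized by a stacked residual Jacobian with respect to the calibration parameters and a measurement noise covariance, scores each segment by the log-determinant of its Gauss-Newton information matrix, and greedily keeps a fixed budget of the highest-scoring segments. The function and variable names (`segment_information_gain`, `select_informative_segments`, `budget`) are hypothetical.

```python
import numpy as np

def segment_information_gain(jacobian, noise_cov):
    """Illustrative score: log-determinant of the segment's Gauss-Newton
    information matrix J^T W J, with W the inverse measurement noise covariance.
    (Assumed criterion for this sketch, not necessarily the paper's.)"""
    W = np.linalg.inv(noise_cov)
    info = jacobian.T @ W @ jacobian
    # Small regularizer keeps the log-determinant finite for weakly
    # informative (rank-deficient) segments.
    sign, logdet = np.linalg.slogdet(info + 1e-9 * np.eye(info.shape[0]))
    return logdet if sign > 0 else -np.inf

def select_informative_segments(segments, budget):
    """Keep the `budget` segments with the largest information score,
    so the subsequent calibration runs on a fixed-size subset."""
    scores = [segment_information_gain(J, R) for J, R in segments]
    ranked = np.argsort(scores)[::-1]
    return [int(i) for i in ranked[:budget]]

if __name__ == "__main__":
    # Synthetic example: 20 hypothetical motion segments, 6 calibration
    # parameters (e.g., camera-IMU extrinsic rotation and translation).
    rng = np.random.default_rng(0)
    n_params = 6
    segments = []
    for _ in range(20):
        n_resid = int(rng.integers(30, 120))
        J = rng.standard_normal((n_resid, n_params))
        R = np.eye(n_resid) * 0.01  # isotropic measurement noise
        segments.append((J, R))
    print(select_informative_segments(segments, budget=5))
```

In this sketch the selection is a simple top-k ranking per segment; a submodular or greedy joint criterion over the combined information matrix would be a natural refinement, but is omitted for brevity.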
