This paper introduces a new method for fast calibration of inertial measurement units (IMUs) that are rigidly coupled to cameras. That is, the relative rotation and translation between the IMU and the camera are estimated, allowing IMU data to be transferred into the camera's coordinate frame. Moreover, the IMU's nuisance parameters (biases and scales) and the horizontal alignment of the initial camera frame are determined. Since an iterated Kalman filter is used for estimation, information on the estimate's precision is also available. Such calibrations are crucial for IMU-aided visual robot navigation, i.e. SLAM, since wrong calibrations cause biases and drifts in the estimated position and orientation. As the estimation runs in real time, the calibration can be performed with a freehand movement and the estimated parameters can be validated immediately. This provides the opportunity to optimize the trajectory online, increasing the quality and minimizing the time needed for calibration. Except for a marker pattern used for visual tracking, no additional hardware is required.
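The transfer of IMU data into the camera frame described above can be sketched as follows. This is a minimal illustration, not the paper's implementation; all variable names and numeric values are hypothetical placeholders for an estimated calibration result.

```python
import numpy as np

# Hypothetical calibration output (illustrative values, not from the paper):
R_cam_imu = np.eye(3)                         # estimated rotation IMU -> camera
t_cam_imu = np.array([0.05, 0.0, 0.02])       # estimated translation (m)
gyro_bias = np.array([0.01, -0.02, 0.005])    # rad/s
accel_bias = np.array([0.10, 0.0, -0.05])     # m/s^2
gyro_scale = np.array([1.01, 0.99, 1.00])     # per-axis scale factors

def imu_to_camera(gyro_raw, accel_raw):
    """Correct raw IMU readings with the estimated nuisance parameters
    (bias, scale) and rotate them into the camera coordinate frame."""
    gyro = R_cam_imu @ ((gyro_raw - gyro_bias) / gyro_scale)
    accel = R_cam_imu @ (accel_raw - accel_bias)
    return gyro, accel
```

Once the extrinsics and nuisance parameters are estimated, every subsequent IMU sample can be mapped into the camera frame this way before being fused with the visual measurements.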
As will be shown, the system is capable of estimating the calibration within a short period of time: depending on the requested precision, trajectories of 30 seconds to a few minutes are sufficient. This allows the system to be calibrated at startup, so that deviations in the calibration caused by transport and storage can be compensated. The estimation quality and consistency are evaluated as functions of the traveled trajectory and of the amount of IMU-camera displacement and rotational misalignment. It is analyzed how different types of visual markers, i.e. 2- and 3-dimensional patterns, affect the estimation. Moreover, the method is applied to both mono and stereo vision systems, providing information on its applicability to robot systems. The algorithm is implemented within a modular software framework, so it can be adapted to changed conditions easily.
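The abstract attributes the precision information to an iterated Kalman filter. A single iterated EKF measurement update, which relinearizes the measurement model about the current iterate (Gauss-Newton style), can be sketched generically as below. This is a textbook-form sketch under assumed interfaces (`h`, `H_jac` are caller-supplied), not the paper's actual filter.

```python
import numpy as np

def iekf_update(x, P, z, h, H_jac, R, n_iter=5):
    """One iterated EKF measurement update.

    x, P   : prior state mean and covariance
    z      : measurement vector
    h      : measurement function h(x)
    H_jac  : Jacobian of h evaluated at a given state
    R      : measurement noise covariance
    """
    x_i = x.copy()
    for _ in range(n_iter):
        H = H_jac(x_i)                         # relinearize at current iterate
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        # IEKF iterate: residual plus correction for the relinearization point
        x_i = x + K @ (z - h(x_i) - H @ (x - x_i))
    H = H_jac(x_i)
    P_new = (np.eye(len(x)) - K @ H) @ P       # posterior covariance
    return x_i, P_new
```

For a linear measurement model the iteration is stationary after the first step and reduces to the standard Kalman update; the benefit appears with the nonlinear camera projection model, where relinearizing reduces linearization error in the estimated extrinsics.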