Self-Calibration of Inertial and Omnidirectional Visual Sensors for Navigation and Mapping

Omnidirectional cameras are versatile sensors that provide a full 360-degree view of the environment. When combined with inertial sensing, omnidirectional vision offers a potentially robust navigation solution. However, to correctly fuse the data from an omnidirectional camera and an inertial measurement unit (IMU) into a single navigation frame, the 6-DOF transform between the sensors must be accurately known. In this paper we describe an algorithm, based on the unscented Kalman filter, for self-calibration of the transform between an omnidirectional camera and an IMU. We show that the IMU biases, the local gravity vector, and the metric scene structure can also be recovered from camera and IMU measurements. Further, our approach does not require any additional hardware or prior knowledge about the environment in which a robot is operating. We present results from calibration experiments with an omnidirectional camera and a low-cost IMU, which demonstrate accurate self-calibration of the 6-DOF sensor-to-sensor transform.
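To make the filtering machinery concrete, the sketch below shows the unscented transform at the core of an unscented Kalman filter of the kind described above: sigma points are drawn from the current Gaussian estimate, pushed through a nonlinear model, and recombined into a transformed mean and covariance. The state layout, dimensions, parameter values, and function names here are illustrative assumptions for this sketch, not the authors' implementation.

import numpy as np

def sigma_points(mean, cov, kappa=1.0):
    # Generate the 2n+1 sigma points and weights for an n-dimensional
    # Gaussian, using a Cholesky factor as the matrix square root.
    n = mean.size
    S = np.linalg.cholesky((n + kappa) * cov)
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return pts, w

def unscented_propagate(mean, cov, f, kappa=1.0):
    # Propagate a Gaussian state estimate through a nonlinear function f
    # (e.g., an IMU motion model or an omnidirectional camera measurement
    # model) and recover the transformed mean and covariance.
    pts, w = sigma_points(mean, cov, kappa)
    mapped = np.array([f(p) for p in pts])
    new_mean = w @ mapped
    diff = mapped - new_mean
    new_cov = (w[:, None] * diff).T @ diff
    return new_mean, new_cov

# Hypothetical calibration state: stack the IMU pose, velocity, gyro and
# accelerometer biases, local gravity vector, and 6-DOF camera-IMU
# transform into one vector; the filter estimates all of them jointly.
x = np.zeros(24)          # illustrative state dimension only
P = np.eye(24) * 1e-2     # assumed initial covariance
motion = lambda s: s      # placeholder for the IMU propagation model
x, P = unscented_propagate(x, P, motion)

Note that a real visual-inertial filter must treat orientation states with care (e.g., quaternion-aware sigma points and error-state covariances), which this plain vector-space sketch glosses over.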
