Extrinsic Parameter Calibration Method for a Visual/Inertial Integrated System with a Predefined Mechanical Interface

For a visual/inertial integrated system, the calibration of extrinsic parameters plays a crucial role in ensuring accurate navigation and measurement. In this work, a novel extrinsic parameter calibration method is developed based on geometrical constraints in the object space and is implemented by manual swing. The camera and IMU frames are aligned to the system body frame, which is predefined by the mechanical interface. During the swinging motion, the fixed checkerboard provides constraints for calibrating the extrinsic parameters of the camera, whereas angular velocity and acceleration provide constraints for calibrating the extrinsic parameters of the IMU. We exploit the complementary nature of the camera and IMU: the IMU assists in checkerboard corner detection and correction, while the camera suppresses the effects of IMU drift. The results of the calibration experiment reveal that the extrinsic parameter accuracy reaches 0.04° for each Euler angle and 0.15 mm for each position vector component (1σ).
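The swing-based alignment described above rests on matching motion quantities (e.g. angular velocity or gravity directions) observed in both the sensor frame and the body frame. As a minimal illustrative sketch, and not the paper's actual algorithm, the rotation between two frames can be recovered from paired direction vectors by solving Wahba's problem with an SVD-based (Kabsch) alignment; the function name and setup below are our own assumptions:

```python
import numpy as np

def align_rotation(v_body, v_imu):
    """Estimate the rotation R (body <- imu) from paired direction
    vectors, one per row, by solving Wahba's problem via SVD (Kabsch).
    Returns R such that v_body[i] ~= R @ v_imu[i]."""
    H = v_imu.T @ v_body                  # 3x3 cross-covariance of the pairs
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])            # guard against a reflection solution
    return Vt.T @ D @ U.T

# Usage sketch: recover a known rotation from noiseless pairs.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
rng = np.random.default_rng(0)
v_imu = rng.normal(size=(10, 3))
v_imu /= np.linalg.norm(v_imu, axis=1, keepdims=True)
v_body = (R_true @ v_imu.T).T
R_est = align_rotation(v_body, v_imu)
```

In practice the per-axis angular accuracy reported in the abstract (0.04°, 1σ) would come from the full constrained optimization over many swing poses, not from a single closed-form alignment like this.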
