High-Precision Image Aided Inertial Navigation with Known Features: Observability Analysis and Performance Evaluation

A high-precision image-aided inertial navigation system (INS) is proposed as an alternative to carrier-phase-based differential Global Navigation Satellite System (CDGNSS) positioning when satellite-based navigation is unavailable. In this paper, the image/INS integration is modeled as a tightly-coupled iterative extended Kalman filter (IEKF). Tight coupling keeps the integrated system reliable even when few known feature points (i.e., fewer than three) are observed in the images. A new global observability analysis of this tightly-coupled integration is presented to establish the conditions under which the system is observable. The conclusions of the analysis were verified by simulations and field tests. The field tests further indicate that high-precision position (centimeter-level) and attitude (half-degree-level) integrated solutions can be achieved in a global reference frame.
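The core of the integration is the IEKF measurement update, which re-linearizes the nonlinear camera measurement model around the latest state iterate rather than the prior (a Gauss-Newton-style refinement). The following is a minimal sketch of that idea only, using a single range measurement to a known 2-D feature point as a stand-in observation model; the function name, the range model, and all numeric values are illustrative assumptions, not the paper's actual camera measurement equations.

```python
import numpy as np

def iekf_update(x_prior, P, z, feature, R, n_iter=5):
    """One IEKF measurement update (standard iterated-EKF form):
    re-linearize h(x) around the latest iterate x_i, not the prior.
    Illustrative only: the measurement here is a scalar range to a
    known feature, not the paper's image observation model."""
    x_i = x_prior.copy()
    for _ in range(n_iter):
        diff = x_i - feature
        rng = np.linalg.norm(diff)               # predicted range h(x_i)
        H = (diff / rng).reshape(1, -1)          # Jacobian of range w.r.t. position
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        # IEKF iterate: prior plus gain times the re-linearized innovation
        x_i = x_prior + (K @ (z - rng - H @ (x_prior - x_i))).ravel()
    P_post = (np.eye(len(x_prior)) - K @ H) @ P  # covariance at the converged linearization
    return x_i, P_post

# Usage: prior position near (1, 1), true position (1.2, 0.9), feature at origin.
x_prior = np.array([1.0, 1.0])
P = np.eye(2) * 0.04
feature = np.zeros(2)
z = np.array([np.hypot(1.2, 0.9)])               # noiseless range, for illustration
R = np.array([[1e-4]])
x_post, P_post = iekf_update(x_prior, P, z, feature, R)
```

With a single range observation, only the component of position along the line of sight is corrected, which mirrors the paper's point that tight coupling lets each individual feature measurement contribute even when too few features are visible for a standalone vision fix.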
