Fusing Stereo Camera and Low-Cost Inertial Measurement Unit for Autonomous Navigation in a Tightly-Coupled Approach

Accurate motion estimation is a central task in autonomous navigation. The integration of an Inertial Navigation System (INS) with the Global Positioning System (GPS) can provide accurate location estimates, but cannot be used in GPS-denied environments. In this paper, we present a tightly-coupled approach to integrating a stereo camera with a low-cost inertial sensor. The approach exploits the inertial sensor's fast response and the visual sensor's slow drift. In contrast to previous approaches, features both near to and far from the camera are considered simultaneously in the visual-inertial filter. Near features are parameterised as three-dimensional (3D) Cartesian points, which provide range and heading information, whereas far features are initialised as Inverse Depth (ID) points, which provide bearing information only. In addition, the inertial sensor biases are estimated and a stationary alignment is performed. The algorithm employs an Iterative Extended Kalman Filter (IEKF) to estimate the motion of the system, the inertial sensor biases, and the tracked features over time. An outdoor experiment validates the proposed algorithm and demonstrates its accuracy.
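To make the two feature parameterisations concrete, the sketch below shows how an inverse-depth feature can be converted to a 3D Cartesian point once its depth estimate becomes well conditioned. This is a minimal illustration, assuming the common six-parameter encoding (anchor camera position, azimuth, elevation, inverse depth) from the inverse-depth SLAM literature; the function name and axis convention are illustrative, not taken from this paper.

```python
import numpy as np

def inverse_depth_to_cartesian(y):
    """Convert an inverse-depth feature to a 3D Cartesian point.

    y = (x0, y0, z0, theta, phi, rho), where (x0, y0, z0) is the camera
    centre at first observation, theta is azimuth, phi is elevation, and
    rho is the inverse depth along the observation ray.
    """
    x0, y0, z0, theta, phi, rho = y
    # Unit ray in the world frame defined by azimuth theta and elevation phi
    # (one common convention from the inverse-depth SLAM literature)
    m = np.array([np.cos(phi) * np.sin(theta),
                  -np.sin(phi),
                  np.cos(phi) * np.cos(theta)])
    # Point = anchor camera centre + ray scaled by depth 1/rho
    return np.array([x0, y0, z0]) + (1.0 / rho) * m

# Example: a feature first seen from the origin along the optical axis,
# at an estimated depth of 10 m (rho = 0.1 per metre)
y = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.1])
print(inverse_depth_to_cartesian(y))  # -> [ 0.  0. 10.]
```

The appeal of this encoding for far features is that rho near zero (a very distant point) remains numerically well behaved and approximately Gaussian, whereas a raw Cartesian depth would be unbounded; near features, whose depth is well constrained by the stereo baseline, can be held directly as Cartesian points.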
