Tightly-coupled stereo vision-aided inertial navigation using feature-based motion sensors

A tightly-coupled stereo vision-aided inertial navigation system is proposed in this work, as a synergistic incorporation of vision with other sensors. In order to avoid the loss of information that can result from visual preprocessing, a set of feature-based motion sensors and an inertial measurement unit are fused directly to estimate the vehicle state. Two alternative feature-based observation models are considered within the proposed fusion architecture. The first model uses the trifocal tensor to propagate feature points by homography, expressing geometric constraints among three consecutive scenes. The second is derived by applying a rigid-body motion model to three-dimensional (3D) reconstructed feature points. A kinematic model accounts for the vehicle motion, and a Sigma-Point Kalman filter is used to achieve robust state estimation in the presence of non-linearities. The proposed formulation is derived for a general, platform-independent 3D problem, and it is tested and demonstrated on a real dynamic indoor data-set alongside a simulation experiment. Results show improved estimates compared with a classical visual odometry approach and with a loosely-coupled stereo vision-aided inertial navigation system, even in GPS (Global Positioning System)-denied conditions and when magnetometer measurements are unreliable.
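The Sigma-Point Kalman filter mentioned above propagates state uncertainty through non-linear models by transforming a deterministic set of sigma points rather than linearising. A minimal sketch of the underlying scaled unscented transform for a one-dimensional state is shown below; the function name, the scaling defaults (`alpha`, `beta`, `kappa`), and the scalar restriction are illustrative assumptions, not the paper's actual implementation.

```python
import math

def unscented_transform_1d(mean, var, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a 1-D Gaussian (mean, var) through a nonlinearity f
    using the scaled unscented (sigma-point) transform.

    Illustrative sketch only; the paper's filter operates on the full
    multi-dimensional vehicle state.
    """
    n = 1  # state dimension (scalar case for clarity)
    lam = alpha**2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)

    # 2n + 1 sigma points: the mean, plus symmetric deviations
    sigmas = [mean, mean + spread, mean - spread]

    # Weights for the mean and covariance estimates
    wm0 = lam / (n + lam)
    wc0 = wm0 + (1.0 - alpha**2 + beta)
    wi = 1.0 / (2.0 * (n + lam))

    # Push each sigma point through the nonlinearity
    ys = [f(s) for s in sigmas]

    # Recombine into the transformed mean and variance
    y_mean = wm0 * ys[0] + wi * (ys[1] + ys[2])
    y_var = (wc0 * (ys[0] - y_mean)**2
             + wi * ((ys[1] - y_mean)**2 + (ys[2] - y_mean)**2))
    return y_mean, y_var
```

For an affine map the transform is exact: propagating mean 1.0 and variance 0.5 through f(x) = 2x + 1 recovers mean 3.0 and variance 2.0, which is a quick sanity check for any sigma-point implementation.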
