Tightly Integrating Optical and Inertial Sensors for Navigation Using the UKF

Abstract: This research addresses the benefits of tightly integrating optical and inertial sensors for navigation where GNSS signals are not available. The navigation problem is first described, and error and measurement models are presented. Given a set of features, a feature detection and projection algorithm is developed that uses inertial measurements to predict vectors in the feature space between images. The unscented Kalman filter is then applied to the navigation system, fusing the inertial measurements and feature matches to estimate the navigation trajectory. Finally, the image-aided navigation algorithm is tested in a simulation and an experiment. The results show that combining optical measurements with inertial sensors improves performance for non-GNSS navigation.
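To make the fusion step concrete, the following is a minimal sketch of the idea described above: propagate an inertial state between image frames, predict where a known feature should appear in the next image, and fuse the matched pixel measurement with an unscented Kalman filter. This is not the implementation from this work; the six-element [position, velocity] state, the pinhole camera model, the landmark location, and all noise parameters below are illustrative assumptions.

# Minimal sketch (assumptions noted above), not the paper's implementation.
import numpy as np

def sigma_points(x, P, kappa=0.0):
    """Generate 2n+1 sigma points and weights for state x with covariance P."""
    n = x.size
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def propagate(x, accel, dt):
    """Inertial prediction: integrate a measured acceleration over one frame interval."""
    p, v = x[:3], x[3:]
    return np.hstack([p + v * dt + 0.5 * accel * dt**2, v + accel * dt])

def project(x, landmark, f=800.0, c=np.array([320.0, 240.0])):
    """Assumed pinhole camera looking along +z: map a landmark to pixel coordinates."""
    rel = landmark - x[:3]
    return f * rel[:2] / rel[2] + c

def ukf_step(x, P, accel, dt, pixel_meas, landmark, Q, R):
    # Predict: push sigma points through the inertial propagation model.
    pts, w = sigma_points(x, P)
    pred = np.array([propagate(p, accel, dt) for p in pts])
    x_pred = w @ pred
    P_pred = Q + sum(wi * np.outer(d, d) for wi, d in zip(w, pred - x_pred))
    # Update: predicted pixel location of the feature (this prediction is also
    # where a correspondence search between images would be centered).
    pts, w = sigma_points(x_pred, P_pred)
    zs = np.array([project(p, landmark) for p in pts])
    z_pred = w @ zs
    Pzz = R + sum(wi * np.outer(d, d) for wi, d in zip(w, zs - z_pred))
    Pxz = sum(wi * np.outer(dx, dz) for wi, dx, dz in zip(w, pts - x_pred, zs - z_pred))
    K = Pxz @ np.linalg.inv(Pzz)
    x_new = x_pred + K @ (pixel_meas - z_pred)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new

A full system would also carry attitude and sensor-bias states and process many features per image, but the predict/update structure is the same.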
