A General Approach to Terrain Relative Navigation for Planetary Landing

We describe an algorithm for navigation state estimation during planetary descent that enables precision landing. The algorithm automatically produces 2D-to-3D correspondences between descent images and a surface map, as well as 2D-to-2D correspondences across a sequence of descent images. These correspondences are combined with inertial measurements in an extended Kalman filter that estimates lander position, velocity, and attitude, along with the time-varying biases of the inertial sensors. The filter tightly couples the inertial and camera measurements in a resource-adaptive, and hence real-time-capable, fashion. Results from a sounding rocket test, which covered the dynamic profile of typical planetary landing scenarios, show estimation errors of 0.16 m/s in velocity and 6.4 m in position at touchdown. These results vastly improve on the current state of the art and meet the requirements of future planetary exploration missions.
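
To make the filter architecture concrete, the sketch below shows a drastically simplified version of such a tightly coupled vision-aided extended Kalman filter: IMU propagation of position, velocity, and an accelerometer bias, plus a measurement update from a single 2D-to-3D correspondence with a mapped surface landmark. The class name, focal length, gravity constant, and the assumption of a known attitude are all illustrative; the filter described in the abstract additionally estimates attitude and gyro biases and processes 2D-to-2D feature tracks.

```python
# Minimal sketch of a tightly coupled vision-aided inertial EKF.
# Hypothetical simplifications (NOT the paper's actual filter):
# attitude is assumed known, only an accelerometer bias is estimated,
# and a single 2D-to-3D map correspondence drives the camera update.
import numpy as np

FOCAL = 800.0                           # assumed pinhole focal length [px]
GRAVITY = np.array([0.0, 0.0, -3.71])   # e.g. Mars surface gravity [m/s^2]


class VisionAidedEKF:
    def __init__(self, p0, v0):
        # State x = [position (3), velocity (3), accel bias (3)]
        self.x = np.concatenate([p0, v0, np.zeros(3)])
        self.P = np.diag([10.0] * 3 + [1.0] * 3 + [0.01] * 3)

    def propagate(self, a_meas, R_wb, dt, q_accel=1e-2, q_bias=1e-6):
        """IMU propagation: integrate the bias-corrected specific force."""
        p, v, b = self.x[:3], self.x[3:6], self.x[6:9]
        a_world = R_wb @ (a_meas - b) + GRAVITY
        self.x[:3] = p + v * dt
        self.x[3:6] = v + a_world * dt
        # Jacobian of the discrete dynamics with respect to the state
        F = np.eye(9)
        F[0:3, 3:6] = np.eye(3) * dt    # d(position)/d(velocity)
        F[3:6, 6:9] = -R_wb * dt        # d(velocity)/d(bias)
        Q = np.diag([0.0] * 3 + [q_accel] * 3 + [q_bias] * 3) * dt
        self.P = F @ self.P @ F.T + Q

    def update_map_landmark(self, z_px, landmark_w, R_wb, sigma_px=1.0):
        """Tightly coupled update: the raw pixel observation of a mapped
        landmark corrects the full state through the projection Jacobian
        and the cross-covariances accumulated during propagation."""
        p = self.x[:3]
        c = R_wb.T @ (landmark_w - p)   # landmark in the camera frame
        z_pred = FOCAL * c[:2] / c[2]   # pinhole projection
        # Jacobian of the projection w.r.t. the camera-frame point ...
        J_proj = (FOCAL / c[2]) * np.array([[1.0, 0.0, -c[0] / c[2]],
                                            [0.0, 1.0, -c[1] / c[2]]])
        # ... chained with d(camera point)/d(position) = -R_wb^T
        H = np.zeros((2, 9))
        H[:, 0:3] = J_proj @ (-R_wb.T)
        S = H @ self.P @ H.T + np.eye(2) * sigma_px**2
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += K @ (z_px - z_pred)
        self.P = (np.eye(9) - K @ H) @ self.P
```

One IMU step followed by one landmark update might look like this (all numbers are made up):

```python
ekf = VisionAidedEKF(p0=np.array([0.0, 0.0, 2000.0]),
                     v0=np.array([0.0, 0.0, -40.0]))
R_wb = np.diag([1.0, -1.0, -1.0])   # nadir-pointing camera, attitude known
ekf.propagate(a_meas=np.array([0.0, 0.0, -3.71]), R_wb=R_wb, dt=0.01)
ekf.update_map_landmark(z_px=np.array([12.0, 4.0]),
                        landmark_w=np.array([30.0, -10.0, 0.0]), R_wb=R_wb)
```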
