Performance Analysis for Visual Planetary Landing Navigation Using Optical Flow and DEM Matching

Visual navigation for planetary landing vehicles poses significant scientific and technical challenges due to inclined, high-velocity approach trajectories, a complex 3D environment, and demanding real-time image-processing requirements. High relative navigation accuracy at the landing site is required for obstacle avoidance and for meeting operational constraints. This paper presents detailed performance analysis results for a recently published visual navigation system concept, based on a mono camera as the vision sensor and on matching the recovered 3D model of the landing site against a reference model. The recovered 3D models are produced by real-time, instantaneous optical flow processing of the navigation camera images. An embedded optical correlator is introduced that enables robust, ultra-high-speed optical flow processing under varying and even unfavorable illumination conditions. The performance analysis is based on a detailed software simulation model of the visual navigation system, including the optical correlator as the key component for ultra-high-speed image processing. The paper recalls the general structure of the navigation system and presents detailed end-to-end visual navigation performance results for a Mercury landing reference mission in terms of different visual navigation entry conditions, reference DEM resolutions, navigation camera configurations, and auxiliary sensor information.
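At its core, the correlator-based optical flow measurement amounts to locating the cross-correlation peak between image patches taken at successive time steps; the optical correlator computes this correlation at very high speed in the Fourier domain. A minimal digital sketch of the same principle, assuming NumPy and an idealized circular-shift image model (the function name and patch sizes are illustrative, not from the paper):

```python
import numpy as np

def correlation_shift(ref_patch, cur_patch):
    """Estimate the 2-D displacement of cur_patch relative to ref_patch
    from the cross-correlation peak, computed via FFTs. This is a digital
    analogue of the Fourier-domain correlation an optical correlator
    performs at the speed of light."""
    R = np.fft.fft2(ref_patch)
    C = np.fft.fft2(cur_patch)
    # Cross-correlation surface; its maximum marks the displacement.
    corr = np.fft.ifft2(np.conj(R) * C).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap peak indices into signed shifts (circular FFT convention).
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))

# Illustrative usage: a patch circularly shifted by (5, -7) pixels.
rng = np.random.default_rng(0)
patch = rng.random((64, 64))
moved = np.roll(patch, (5, -7), axis=(0, 1))
print(correlation_shift(patch, moved))  # (5, -7)
```

A dense optical flow field, as used for recovering the 3D model of the landing site, would result from applying such a shift estimate over a grid of local patches; subpixel accuracy would additionally require interpolating around the correlation peak.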
