Abstract. With the development of space technology, lunar research is being carried out by more and more countries. For a lunar landing mission to succeed, the lunar landing module must be equipped with an advanced Positioning and Orientation System (POS) that meets the mission's navigation requirements. For the pinpoint landing missions formulated by NASA, a POS with an error of less than 100 meters is needed so that the lunar module can land safely at the exact destination on the lunar surface. However, existing lunar navigation technologies, such as satellite positioning and star trackers, cannot meet these requirements. Vision-based positioning is an alternative way to guide a lunar landing module to its destination. It comes in two forms: absolute and relative navigation. Relative navigation provides solutions at a higher rate, but its error accumulates over time; absolute navigation, in contrast, can provide an initial position, or position and attitude updates, to the relative navigation system. An integrated navigation system combining the two methods therefore retains the advantages of both stand-alone systems. In addition, an Inertial Navigation System (INS) can compensate for the final phase of descent, when images taken very close to the lunar surface are no longer usable. This study presents an integrated navigation system that combines a vision-based navigation system with an INS, implemented and tested on a simulated lunar surface.
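The fusion idea sketched in the abstract can be illustrated in one dimension: a high-rate relative solution (visual odometry or INS increments) is dead-reckoned forward while its uncertainty grows, and an occasional absolute fix (e.g. from mapped-landmark matching) is folded in with a Kalman update. The function names, noise values, and the scalar-state simplification below are illustrative assumptions, not the paper's actual filter design:

```python
# Minimal 1-D sketch of integrated navigation: a drifting high-rate
# relative solution corrected by occasional absolute position fixes.
# All parameter values (q, r) are hypothetical, chosen for illustration.

def kalman_update(x, P, z, R):
    """Fuse estimate x (variance P) with measurement z (variance R)."""
    K = P / (P + R)           # Kalman gain: how much to trust the fix
    x_new = x + K * (z - x)   # corrected position estimate
    P_new = (1.0 - K) * P     # uncertainty shrinks after the update
    return x_new, P_new

def integrate(rel_increments, abs_fixes, q=0.04, r=25.0):
    """Propagate with relative increments; correct with absolute fixes.

    rel_increments: per-step displacement from the relative/INS solution
    abs_fixes: dict mapping step index -> absolute position measurement
    q: process-noise variance added each step (models drift growth)
    r: variance of an absolute fix
    """
    x, P = 0.0, 1.0
    for k, dx in enumerate(rel_increments):
        x += dx               # dead-reckoning propagation
        P += q                # error accumulates between fixes
        if k in abs_fixes:    # absolute navigation bounds the drift
            x, P = kalman_update(x, P, abs_fixes[k], r)
    return x, P
```

Run over ten unit steps with a single fix at the end, the position uncertainty is lower than pure dead reckoning would leave it, which is the core benefit the abstract claims for the integrated system.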