Biomimetic optic flow sensing applied to a lunar landing scenario

Autonomous landing on unknown extraterrestrial bodies requires fast, noise-resistant motion processing to elicit appropriate steering commands. Flying insects master visual motion sensing with dedicated neural circuits that handle highly parallel data at very low energy cost. Results from neurophysiological, behavioural, and biorobotic studies of insect flight control were applied to landing a spacecraft safely on the Moon in a simulated environment. ESA's Advanced Concepts Team identified autonomous lunar landing as a relevant scenario for testing potential applications of innovative bio-inspired visual guidance systems to space missions. Biomimetic optic flow-based strategies for automatic landing control were tested in a highly realistic simulated Moon environment. Visual input was generated with the PANGU software and used to regulate the optic flow produced during the landing of a two-degree-of-freedom spacecraft. The simulation results showed that a single elementary motion detector (EMD) coupled to a regulator robustly controlled the autonomous approach and descent of the simulated lunar lander. The “low gate”, located approximately 10 m above the ground, was reached with acceptable vertical and horizontal speeds of 4 m/s and 5 m/s, respectively. It was also established that optic flow sensing can cope successfully with temporary sensor blinding and poor lighting conditions.
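To illustrate the kind of control law the abstract describes, the sketch below regulates the ventral optic flow (the ratio of ground speed to height, omega = vx / h) to a constant setpoint by acting on vertical thrust, so that height decreases in proportion to forward speed during the descent. This is a minimal illustration only: the setpoint, gains, clamping limits, and the crude forward-speed decay term are assumptions for demonstration, not the controller, parameters, or simulation used in the paper.

```python
import numpy as np

G_MOON = 1.62      # lunar gravity, m/s^2
DT = 0.01          # integration step, s
OMEGA_SET = 0.3    # optic-flow setpoint, rad/s (assumed value)
KP = 2000.0        # proportional gain on optic-flow error (assumed)

def simulate(h0=1000.0, vx0=50.0, vz0=0.0, mass=500.0, t_end=300.0):
    """Integrate vertical dynamics under an optic-flow-regulating thrust law.

    Horizontal speed vx decays slowly (a drag-like term) as a stand-in for a
    separate pitch / forward-thrust loop, which is not modelled here.
    """
    h, vx, vz = h0, vx0, vz0
    t = 0.0
    log = []
    while h > 10.0 and t < t_end:           # stop near the ~10 m "low gate"
        omega = vx / max(h, 1e-3)            # ventral optic flow, rad/s
        error = OMEGA_SET - omega
        # Thrust counters lunar gravity plus a proportional correction;
        # clamp to a plausible engine envelope (assumed limits).
        thrust = np.clip(mass * G_MOON - KP * error,
                         0.0, 2.0 * mass * G_MOON)
        az = thrust / mass - G_MOON          # net vertical acceleration (up +)
        vz += az * DT
        h += vz * DT
        vx += -0.02 * vx * DT                # crude forward-speed decay (assumed)
        log.append((t, h, vx, vz, omega))
        t += DT
    return np.array(log)

if __name__ == "__main__":
    traj = simulate()
    t, h, vx, vz, omega = traj[-1]
    print(f"low gate reached at t={t:.1f} s: h={h:.1f} m, "
          f"vx={vx:.2f} m/s, vz={vz:.2f} m/s, omega={omega:.3f} rad/s")
```

Because the regulator holds omega roughly constant, the simulated lander's height tracks its decaying forward speed (h ≈ vx / omega), yielding the proportional reduction of height and speed characteristic of optic flow-regulated landings; the specific touchdown speeds reported in the abstract come from the paper's own PANGU-based simulation, not from this toy model.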
