A contribution to vision-based autonomous helicopter flight in urban environments

A navigation strategy that exploits optic flow and inertial information to continuously avoid collisions with both lateral and frontal obstacles has been used to control a simulated helicopter flying autonomously in a textured urban environment. Experimental results demonstrate that the corresponding controller generates cautious behavior: the helicopter tends to stay in the middle of narrow corridors, and its forward velocity is automatically reduced when obstacle density increases. When confronted with a frontal obstacle, the controller is also able to generate a tight U-turn that ensures the UAV's survival. The paper provides comparisons with related work and discusses the applicability of the approach to real platforms.
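The behaviors described above can be sketched as simple optic-flow-driven control laws: balancing left and right flow magnitudes yields corridor centering, the total lateral flow modulates forward speed, and frontal flow expansion triggers the avoidance turn. The sketch below is illustrative only; the gains, thresholds, and exact control laws are assumptions for exposition, not the paper's actual equations.

```python
def steer_command(flow_left, flow_right, k_yaw=0.5):
    """Yaw toward the side with weaker translational optic flow.

    Equal flow on both sides (equidistant walls) gives a zero yaw
    command, so the vehicle tends toward the corridor centerline.
    """
    return k_yaw * (flow_right - flow_left)


def forward_speed(flow_left, flow_right, v_max=2.0, k_speed=1.0):
    """Reduce forward speed as total lateral flow grows.

    Higher flow means closer or denser obstacles, so speed falls
    smoothly from v_max toward zero (illustrative 1/(1+x) law).
    """
    return v_max / (1.0 + k_speed * (flow_left + flow_right))


def frontal_avoidance(flow_expansion, threshold=1.5):
    """Trigger a U-turn when frontal flow divergence (expansion of the
    flow field around the focus of expansion) exceeds a threshold,
    signalling an imminent frontal collision."""
    return flow_expansion > threshold
```

In this form the centering and speed-regulation behaviors need no explicit range measurement: both emerge from the same translational flow signals, which is the appeal of the optic-flow approach for lightweight aerial platforms.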
