Wide-Field Integration Methods for Autonomous Navigation in 3-D Environments

Wide-Field Integration (WFI) of optic flow, a method for information extraction modeled on the spatial decompositions performed by specialized interneurons in the insect visuomotor system, can provide useful information about the proximity and relative speed of a vehicle with respect to objects in the environment. In this paper, WFI methods are extended to a 6-DOF UAV and used to demonstrate simultaneous stabilization, obstacle avoidance, and terrain following with 2-D optic flow. Measurements are taken in three planes (yaw, pitch, roll) of the spherical flow field and decomposed spatially into Fourier harmonics. By writing a mathematical description of the 3-D environments that are likely to be encountered, the explicit linearized state information contained in each harmonic is found. Harmonics whose information content is robust to changes in the obstacle environment are selected, and a control law is written that specifies how the optic flow data are decomposed into inputs for each actuator. LQR techniques are used to produce feedback gains that are ultimately employed in an output feedback structure. Stability is proved via the closed-loop eigenvalues, and the UAV is simulated in an urban canyon-like environment. The aircraft is able to navigate the obstacle field, including a 90° bend, while maintaining a fixed altitude above the ground.
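To make the pipeline concrete, the short Python sketch below illustrates the two computational pieces named above: projecting a ring of tangential optic-flow measurements onto low-order Fourier harmonics (the WFI step) and computing state-feedback gains with an LQR solver. The state choice, dynamics matrices, and weights are placeholder assumptions for illustration only, not the paper's linearized UAV model or its reported gains.

```python
"""Illustrative sketch (not the authors' implementation) of WFI-style Fourier
decomposition of optic flow and LQR gain synthesis. All models are assumed."""
import numpy as np
from scipy.linalg import solve_continuous_are


def wfi_harmonics(flow, n_harmonics=2):
    """Project flow samples at equally spaced viewing angles onto cosine/sine harmonics.

    Returns (a, b): a[n] is the n-th cosine coefficient (a[0] is the mean),
    b[n-1] is the n-th sine coefficient.
    """
    gamma = np.linspace(0.0, 2.0 * np.pi, len(flow), endpoint=False)
    a = np.array([2.0 * np.mean(flow * np.cos(n * gamma)) for n in range(n_harmonics + 1)])
    b = np.array([2.0 * np.mean(flow * np.sin(n * gamma)) for n in range(1, n_harmonics + 1)])
    a[0] /= 2.0  # zeroth cosine coefficient is the plain mean
    return a, b


def lqr_gain(A, B, Q, R):
    """Continuous-time LQR: solve the algebraic Riccati equation, K = R^{-1} B^T P."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)


if __name__ == "__main__":
    # Synthetic flow pattern: a rotational bias plus first/second harmonics,
    # which in the WFI framework carry speed and proximity information.
    gamma = np.linspace(0.0, 2.0 * np.pi, 60, endpoint=False)
    flow = 0.3 + 1.2 * np.sin(gamma) + 0.5 * np.cos(2.0 * gamma)
    a, b = wfi_harmonics(flow)
    print("cosine harmonics:", np.round(a, 3))
    print("sine harmonics:  ", np.round(b, 3))

    # Placeholder double-integrator model (e.g. lateral offset and lateral speed).
    A = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    B = np.array([[0.0],
                  [1.0]])
    K = lqr_gain(A, B, Q=np.eye(2), R=np.array([[1.0]]))
    print("LQR gains:", np.round(K, 3))
```

In an output-feedback arrangement of the kind described above, such gains would act on the measured harmonic coefficients rather than on the full state; the mapping from harmonics to states shown here is left abstract because it depends on the environment model in the paper.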
