Navigation in an autonomous flying robot by using a biologically inspired visual odometer

While mobile robots and walking insects can use proprioceptive information (specialized receptors in the insects' legs, or wheel encoders in robots) to estimate distance traveled, flying agents have to rely mainly on visual cues. Experiments with bees provide evidence that flying insects may use the optical flow induced by egomotion to estimate distance traveled, and some details of this odometer have recently been unraveled. In this study, we propose a biologically inspired model of the bee's visual odometer based on Elementary Motion Detectors (EMDs), and present results from goal-directed navigation experiments with an autonomous flying robot platform that we developed specifically for this purpose. The robot is equipped with a panoramic vision system, which provides input to the EMDs of the left and right visual fields. The outputs of the EMDs are then spatially integrated by wide-field motion detectors, and their accumulated response serves directly as the odometer. In an initial set of experiments, the robot moves through a corridor along a fixed route while the EMD outputs, and hence the odometer reading, are recorded. The results show that the proposed model can provide an estimate of the distance traveled, but that performance depends on the route the robot follows; this is biologically plausible, since natural insects tend to adopt a fixed route during foraging. Given these results, we assumed that the optomotor response plays an important role in goal-directed navigation, and we conducted experiments with an autonomous, freely flying robot. These experiments demonstrate that this computationally cheap mechanism can be successfully employed in natural indoor environments.
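The odometer described above can be illustrated with a minimal sketch: a Reichardt-type correlator pair per pixel (a low-pass-filtered "delayed" channel multiplied by the undelayed neighbour, with opponent subtraction), spatially integrated across the field and accumulated over time. The array layout, filter time constant, and function name below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def emd_odometer(frames, dt=0.01, tau=0.05):
    """Accumulate wide-field EMD output as a visual odometer.

    frames: 2-D array (time x pixels), e.g. one row of a
    panoramic image per time step.
    """
    alpha = dt / (tau + dt)              # first-order low-pass coefficient
    delayed = np.zeros(frames.shape[1])  # delayed (filtered) channel
    odometer = 0.0
    for frame in frames:
        # Reichardt correlator: delayed signal of one photoreceptor
        # multiplied by the undelayed signal of its neighbour, minus
        # the mirror-symmetric term (opponent subtraction).
        rightward = delayed[:-1] * frame[1:]
        leftward = frame[:-1] * delayed[1:]
        emd = rightward - leftward
        # Wide-field spatial integration; the unsigned response is
        # accumulated so motion in either direction adds to the
        # estimated distance traveled.
        odometer += abs(emd.sum())
        # Update the low-pass "delay" filter for the next time step.
        delayed += alpha * (frame - delayed)
    return odometer
```

A drifting pattern (image motion, as during flight) drives the accumulated response up, while a stationary pattern yields essentially zero, which is the property the odometer relies on.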