Adaptive Homing—Robotic Exploration Tours

This article presents a minimalistic model for the learning and adaptation of visual homing. Normalized Hebbian learning is applied during the exploration tours of a mobile robot, allowing the robot to learn visual homing and to adapt to its sensory modalities. The robot's sensors (omnidirectional camera, magnetic compass) were chosen so that their data closely resemble the sensory data available to insects such as the desert ant Cataglyphis (almost omnidirectional vision, polarized-light compass), an amazing navigator despite its tiny brain. The learned homing mechanism turned out to be closely related to the average landmark vector (ALV) model of Lambrinos and colleagues, and it is largely independent of any special features of the environment. In contrast to the ALV model and other models of visual homing, neither feature extraction nor landmark segmentation is necessary. Mobile-robot experiments in an unmodified office environment confirmed the feasibility of learning visual homing.
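The "normalized Hebbian learning" named above is commonly implemented as Oja's rule, in which the weight decay built into the update keeps the weight vector at unit norm. The following sketch illustrates that rule in isolation; the learning rate, input dimensionality, and covariance matrix are illustrative assumptions, not values from the article:

```python
import numpy as np

def oja_step(w, x, eta=0.01):
    """One normalized-Hebbian (Oja) update: w <- w + eta * y * (x - y * w)."""
    y = np.dot(w, x)              # linear neuron output
    return w + eta * y * (x - y * w)

# Train on inputs drawn from a correlated 2-D distribution; the weight
# vector converges toward the principal-component direction with unit
# norm -- the "normalization" in normalized Hebbian learning.
rng = np.random.default_rng(0)
w = rng.normal(size=2)
C = np.array([[3.0, 1.0],         # illustrative input covariance
              [1.0, 1.0]])
L = np.linalg.cholesky(C)         # to sample x with covariance C
for _ in range(5000):
    x = L @ rng.normal(size=2)
    w = oja_step(w, x)

print(np.linalg.norm(w))          # close to 1.0 after convergence
```

In the article's setting the inputs would be the robot's visual sensor values rather than synthetic samples, but the update itself is the same.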
