This paper describes route-learning experiments with an autonomous mobile robot in which map building is achieved through unsupervised clustering of sensory data. The resulting topological mapping of the robot's perceptual space is used for subsequent navigation tasks such as route following. After the autonomous map-building process is completed, the acquired generalised perceptions are associated with motor actions, enabling the robot to follow routes autonomously. The navigation system has been tested extensively on a Nomad 200 mobile robot; it is reliable and copes with the noise and variation inherent in the environment. Two aspects of the map-building and route-following system described here are important. First, the relevance or irrelevance of perceptual features is determined autonomously by the robot, not predefined by the designer. Second, the route-learning system enables the robot to use the map to associate perception with action, rather than for localisation alone.
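The mechanism summarised above, clustering raw sensor readings into generalised perceptions and then binding each perception to a motor action, can be illustrated with a minimal sketch. The code below is not the paper's implementation; it assumes a simple online nearest-centroid clusterer with a fixed novelty radius (the class and parameter names are hypothetical), standing in for whatever unsupervised scheme the robot actually uses:

```python
import math


class PerceptualClusterer:
    """Online unsupervised clustering of sensor vectors (illustrative sketch).

    A new reading joins the nearest existing cluster if it lies within
    `radius` of that cluster's centroid; otherwise it seeds a new cluster.
    Cluster indices play the role of 'generalised perceptions'.
    """

    def __init__(self, radius):
        self.radius = radius
        self.centroids = []  # running mean of each cluster
        self.counts = []     # number of readings absorbed per cluster

    def classify(self, reading):
        # Find the nearest existing centroid.
        best, best_d = None, float("inf")
        for i, c in enumerate(self.centroids):
            d = math.dist(c, reading)
            if d < best_d:
                best, best_d = i, d
        # Novel perception: start a new cluster.
        if best is None or best_d > self.radius:
            self.centroids.append(list(reading))
            self.counts.append(1)
            return len(self.centroids) - 1
        # Familiar perception: update the winner's running mean.
        n = self.counts[best] + 1
        self.centroids[best] = [
            (c * (n - 1) + x) / n
            for c, x in zip(self.centroids[best], reading)
        ]
        self.counts[best] = n
        return best


# Route learning: associate each generalised perception with a motor action.
clusterer = PerceptualClusterer(radius=1.0)
route = {}  # cluster id -> motor action
training = [((5.0, 5.0), "forward"),
            ((5.1, 4.9), "forward"),
            ((0.2, 9.0), "turn_left")]
for reading, action in training:
    route[clusterer.classify(reading)] = action

# Route following: a noisy but similar reading recalls the learned action.
assert route[clusterer.classify((5.05, 4.95))] == "forward"
```

Because classification is by proximity rather than exact match, the robot tolerates sensor noise: any reading near a learned cluster recalls that cluster's action, which is the property the route-following experiments rely on.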