To perform planetary exploration without human supervision, a completely autonomous robot must be able to model its environment and to locate itself while exploring its surroundings. For that purpose, the authors propose a modular perception system for an autonomous explorer. The perception system maintains a consistent internal representation of the observed terrain, built from multiple sensor views, and other modules access this representation through queries. The perception system is intended for the Ambler, a six-legged vehicle being built at the authors' university. A partial implementation of the system using a range scanner is presented, along with experimental results from a testbed that includes the sensor, one computer-controlled leg, and obstacles on a sandy surface.
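
The query-based access to the terrain representation can be illustrated with a minimal sketch (hypothetical names and interfaces, not the paper's actual implementation): a grid elevation map is updated from successive range-sensor views and answers elevation queries from other modules, for example a gait planner selecting footfall positions.

import numpy as np

class TerrainMap:
    """Grid elevation map built from successive range-sensor views (illustrative sketch)."""

    def __init__(self, size_m=10.0, resolution_m=0.1):
        n = int(size_m / resolution_m)
        self.resolution = resolution_m
        self.elevation = np.full((n, n), np.nan)  # unobserved cells are NaN

    def integrate_view(self, points_xyz):
        """Merge one sensor view: points_xyz is an (N, 3) array in map coordinates."""
        for x, y, z in points_xyz:
            i = int(x / self.resolution)
            j = int(y / self.resolution)
            if 0 <= i < self.elevation.shape[0] and 0 <= j < self.elevation.shape[1]:
                cur = self.elevation[i, j]
                # keep the highest elevation observed in each cell
                self.elevation[i, j] = z if np.isnan(cur) else max(cur, z)

    def query_elevation(self, x, y):
        """Return the elevation at (x, y), or None if the cell is still unobserved."""
        i, j = int(x / self.resolution), int(y / self.resolution)
        z = self.elevation[i, j]
        return None if np.isnan(z) else float(z)

# Example: another module (e.g., a footfall planner) queries the map.
terrain = TerrainMap()
terrain.integrate_view(np.array([[1.0, 2.0, 0.05], [1.1, 2.0, 0.07]]))
print(terrain.query_elevation(1.0, 2.0))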