A multirange architecture for collision-free off-road robot navigation

We present a multilayered mapping, planning, and command execution system developed and tested on the LAGR mobile robot. Key to robust performance under uncertainty is the combination of a short-range perception system operating at high frame rate and low resolution, and a long-range, adaptive vision system operating at lower frame rate and higher resolution. The short-range module performs local planning and obstacle avoidance with fast reaction times, whereas the long-range module performs strategic visual planning. Probabilistic traversability labels provided by the perception modules are combined and accumulated into a robot-centered hyperbolic-polar map with a 200-m effective range. Instead of using a dynamical model of the robot for short-range planning, the system uses a large lookup table of physically possible trajectory segments recorded on the robot in a wide variety of driving conditions. Localization is performed using a combination of a global positioning system, wheel odometry, an inertial measurement unit, and a high-speed, low-complexity rotational visual odometry module. The end-to-end system was verified in independent government tests. © 2008 Wiley Periodicals, Inc.
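The robot-centered hyperbolic-polar map can be illustrated with a minimal sketch: radial bin edges grow geometrically out to the 200-m horizon, so cells are fine near the robot and coarse at range, and probabilistic traversability labels are fused per cell. The class name, bin counts, and log-odds fusion rule below are illustrative assumptions, not the paper's implementation.

```python
import math

class HyperbolicPolarMap:
    """Hypothetical robot-centered polar map with geometrically growing rings."""

    def __init__(self, n_rings=32, n_sectors=72, r0=0.5, r_max=200.0):
        self.n_rings = n_rings
        self.n_sectors = n_sectors
        self.r0 = r0          # inner edge of the first ring (m), assumed value
        self.r_max = r_max    # effective range (m), from the paper
        # Geometric growth factor chosen so ring n_rings reaches r_max.
        self.growth = (r_max / r0) ** (1.0 / n_rings)
        # Per-cell log-odds accumulator for traversability evidence.
        self.cells = [[0.0] * n_sectors for _ in range(n_rings)]

    def cell_index(self, x, y):
        """Map a point (x, y) in the robot frame to (ring, sector), or None if out of range."""
        r = math.hypot(x, y)
        if r < self.r0 or r >= self.r_max:
            return None
        ring = int(math.log(r / self.r0) / math.log(self.growth))
        sector = int((math.atan2(y, x) % (2 * math.pi)) / (2 * math.pi) * self.n_sectors)
        return ring, sector

    def accumulate(self, x, y, log_odds_obstacle):
        """Fuse one probabilistic traversability label into the map."""
        idx = self.cell_index(x, y)
        if idx is not None:
            ring, sector = idx
            self.cells[ring][sector] += log_odds_obstacle
```

With these assumed parameters, a point 1 m ahead falls in one of the first few rings while a point beyond 200 m is discarded, mirroring the high near-field resolution and bounded range described above.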