Autonomous driving in urban environments: Boss and the Urban Challenge

Boss is an autonomous vehicle that uses on-board sensors (global positioning system, lasers, radars, and cameras) to track other vehicles, detect static obstacles, and localize itself relative to a road model. A three-layer planning system combines mission, behavioral, and motion planning to drive in urban environments. The mission planning layer considers which street to take to achieve a mission goal. The behavioral layer determines when to change lanes and precedence at intersections and performs error recovery maneuvers. The motion planning layer selects actions to avoid obstacles while making progress toward local goals. The system was developed from the ground up to address the requirements of the DARPA Urban Challenge using a spiral system development process with a heavy emphasis on regular, regressive system testing. During the National Qualification Event and the 85-km Urban Challenge Final Event, Boss demonstrated some of its capabilities, qualifying first and winning the challenge. © 2008 Wiley Periodicals, Inc.
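The three-layer decomposition described above (mission, behavioral, motion) can be sketched as follows. This is a hypothetical illustration of the layering only; all class and method names are invented for this sketch and are not Boss's actual API.

```python
# Hypothetical sketch of the three-layer planning decomposition:
# mission planning picks streets, the behavioral layer picks maneuvers,
# and motion planning produces obstacle-free local trajectories.
# Names and return values are illustrative stubs, not Boss's interfaces.

class MissionPlanner:
    """Chooses which streets to take to reach the mission goal."""
    def plan_route(self, road_model, goal):
        # In practice: graph search over the road network; here a stub.
        return [road_model[0], goal]

class BehavioralLayer:
    """Decides lane changes, intersection precedence, error recovery."""
    def select_behavior(self, route, traffic_state):
        # In practice: rule- or state-machine-based maneuver selection.
        return {"maneuver": "follow_lane", "target": route[0]}

class MotionPlanner:
    """Selects actions that avoid obstacles while making local progress."""
    def plan_motion(self, behavior, obstacles):
        # In practice: trajectory generation and collision checking.
        return [(0.0, 0.0), (1.0, 0.1), (2.0, 0.0)]  # waypoints stub

def drive_step(road_model, goal, traffic_state, obstacles):
    # One pass down the hierarchy: route -> maneuver -> trajectory.
    route = MissionPlanner().plan_route(road_model, goal)
    behavior = BehavioralLayer().select_behavior(route, traffic_state)
    return MotionPlanner().plan_motion(behavior, obstacles)
```

The key design point is that each layer narrows the problem for the one below it: the mission layer never reasons about obstacles, and the motion layer never reasons about routes.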
