A Driving Robot For Autonomous Vehicles On Extreme Courses

Abstract: This contribution presents a novel robot system for autonomous vehicle guidance in standard passenger cars. It transmits real-time video, sensor and control data to a remote control station. This station allows remote monitoring of the system state, updating of mission commands and overriding of the autonomous system functions by remote manual control. The underlying concept of the autonomous system arranges sensors, control and actuators in a redundant manner, thus yielding a high level of system reliability through plausibility checks. A multisensor perception platform comprises DGPS/INS, laser scanners, stereo vision and radar. It provides information about the vehicle's ego-position, the lane geometry, and obstacles around the vehicle. Plausibility and reliability of the sensor data are assessed in a sensor data fusion unit. Path planning and vehicle control are accompanied by an 'electronic copilot', which intervenes in critical situations and brings the vehicle to a safe standstill. The actuators are 'seated' on the driver's seat and engage brake, accelerator, clutch, gear shift and steering wheel in a manner similar to the legs and arms of a human driver. The modular setup clearly separates the standard vehicle from the autonomous system, thus allowing rapid equipping of standard passenger cars. System performance is demonstrated by experimental results on courses designed for durability tests of passenger cars.
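The redundancy-based plausibility check mentioned above can be illustrated with a minimal sketch. This is an illustrative assumption, not the authors' implementation: obstacle distances reported by independent sensors (e.g. laser scanner, stereo vision, radar) are cross-checked against their consensus, and readings that deviate too far are rejected before fusion. The function name and threshold are hypothetical.

```python
# Hypothetical sketch of a redundancy-based plausibility check:
# independent sensor readings are compared against their median,
# outliers are rejected, and the remainder are averaged.
# Names and the 2.0 m threshold are illustrative assumptions.

from statistics import median


def fuse_with_plausibility(readings, max_deviation_m=2.0):
    """Fuse redundant distance readings (in metres).

    Returns (fused_value, accepted, rejected). A reading is accepted
    only if it lies within max_deviation_m of the median of all
    readings; the fused value is the mean of the accepted readings.
    """
    if not readings:
        raise ValueError("no sensor readings")
    consensus = median(readings)
    accepted = [r for r in readings if abs(r - consensus) <= max_deviation_m]
    rejected = [r for r in readings if abs(r - consensus) > max_deviation_m]
    if not accepted:
        raise ValueError("no plausible readings")
    return sum(accepted) / len(accepted), accepted, rejected


# Example: a radar reading of 42.0 m disagrees with the laser scanner
# (30.1 m) and stereo vision (29.8 m) and is therefore rejected.
fused, ok, bad = fuse_with_plausibility([30.1, 29.8, 42.0])
```

In a real system such a gate would feed the 'electronic copilot': if too few readings pass the plausibility check, the copilot can command a safe standstill rather than act on unreliable data.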