Sensor and navigation data fusion for an autonomous vehicle

This paper describes an approach to several data fusion tasks in an autonomous vehicle. One fusion system combines the data from the vehicle's object-detecting sensors in order to increase accuracy and to reduce the large volume of raw sensor data. A second system fuses navigation data to obtain an accurate estimate of the vehicle state; it draws on the vehicle's ego-position sensors as well as on the object-detecting sensors. The output of both fusion systems is used by the vehicle guidance system to determine the desired path of motion for the autonomous vehicle.
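The abstract gives no implementation details, but navigation-data fusion of this kind is commonly realized with a Kalman filter. Below is a minimal sketch, assuming a standard linear Kalman filter over a constant-velocity vehicle state observed by an ego-position sensor such as DGPS; the state vector, sample period, and noise parameters are all illustrative assumptions, not the paper's actual design.

```python
import numpy as np

# Illustrative sketch only: Kalman-filter fusion of noisy ego-position
# measurements into a vehicle state estimate (position + velocity).
# The filter structure and all parameters below are assumptions.

dt = 0.1  # sample period [s] (assumed)

# State: [x, y, vx, vy]; constant-velocity transition model.
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)

# Ego-position sensor (e.g. DGPS) observes position only.
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)

Q = 0.01 * np.eye(4)  # process noise covariance (assumed)
R = 0.5 * np.eye(2)   # measurement noise covariance (assumed)

def kalman_step(x, P, z):
    """One predict/update cycle fusing a position measurement z."""
    # Predict: propagate state and covariance through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct the prediction with the new measurement.
    y = z - H @ x                   # innovation
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Usage: fuse a short sequence of simulated position fixes.
x = np.zeros(4)
P = np.eye(4)
for t in range(5):
    z = np.array([1.0 * t * dt, 0.5 * t * dt]) + 0.1 * np.random.randn(2)
    x, P = kalman_step(x, P, z)
print("estimated state [x, y, vx, vy]:", x)
```

In a full system of the kind the paper describes, the same estimate could additionally be corrected using relative measurements from the object-detecting sensors, and the resulting state fed to the vehicle guidance module.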
