Deriving Spatial Occupancy Evidence from Radar Detection Data

Centralized low-level sensor data fusion approaches are gaining popularity in advanced driver assistance systems. They allow ambiguities in the retrieval of environmental information to be resolved on the basis of a large pool of raw data. One emerging challenge in this context is the unification of sensor data from different formats and sensor types. Spatial occupancy grids are a popular intermediate data representation. Converting a discrete list of radar detections, a commonly used measurement format, into such a grid is problematic due to the sparse spatial resolution of the detections. This work addresses this conversion by interpolating the data spatially using generic sensor model knowledge. Traditional approaches derive occupancy evidence only in the vicinity of a detection. In addition, we analyze spatial and kinematic properties derived from Doppler measurements, compute the likelihood that multiple detections are caused by the same object, and deduce the occupancy of the space between them accordingly. The incorporation of sensor parameters allows full-range and short-range radars to be handled generically. Furthermore, we outline the deduction of free-space evidence. The elaborated models and algorithms are evaluated on real-world datasets and discussed with respect to their applicability in a subsequent Dempster-Shafer-based sensor data fusion approach.
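The two core ideas of the abstract — spreading occupancy evidence around each detection and combining per-cell evidence in a Dempster-Shafer framework — can be illustrated with a minimal sketch. This is not the paper's actual sensor model: the isotropic Gaussian spread, the `sigma` and `peak_mass` parameters, and the restriction to masses on {occupied} and the ignorance set are simplifying assumptions made here for illustration.

```python
import numpy as np

def detection_to_occupancy_evidence(grid_shape, cell_size, detection_xy,
                                    sigma=0.5, peak_mass=0.7):
    """Spread occupancy mass around one radar detection as an isotropic Gaussian.

    Returns a grid of Dempster-Shafer mass assigned to 'occupied'; the
    remainder in each cell stays on the ignorance set {occupied, free}.
    """
    ys, xs = np.indices(grid_shape)
    # Metric coordinates of the cell centers.
    cx = (xs + 0.5) * cell_size
    cy = (ys + 0.5) * cell_size
    d2 = (cx - detection_xy[0]) ** 2 + (cy - detection_xy[1]) ** 2
    return peak_mass * np.exp(-d2 / (2.0 * sigma ** 2))

def dempster_combine_occ(m1_occ, m2_occ):
    """Combine two per-cell mass functions via Dempster's rule.

    Since both sources here assign mass only to 'occupied' and to the
    ignorance set, no conflict arises and the rule simplifies to:
    m(occ) = m1(occ)m2(occ) + m1(occ)m2(Theta) + m1(Theta)m2(occ).
    """
    m1_unk = 1.0 - m1_occ
    m2_unk = 1.0 - m2_occ
    return m1_occ * m2_occ + m1_occ * m2_unk + m1_unk * m2_occ

# Two nearby detections reinforce the occupancy belief between them.
g1 = detection_to_occupancy_evidence((40, 40), 0.25, (5.0, 5.0))
g2 = detection_to_occupancy_evidence((40, 40), 0.25, (5.5, 5.0))
fused = dempster_combine_occ(g1, g2)
```

Because the sketch keeps free-space mass out of the combination, the simplified rule is conflict-free; a full implementation along the lines of the paper would also carry a free-space mass per cell and normalize by the conflict term.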
