Precision tracking with sparse 3D and dense color 2D data

Precision tracking is important for predicting the behavior of other cars in autonomous driving. We present a novel method that combines laser and camera data to achieve accurate velocity estimates of moving vehicles. We fuse sparse laser points with a high-resolution camera image to obtain a dense colored point cloud. We then use a color-augmented search algorithm to align the dense color point clouds from successive time frames for a moving vehicle, thereby obtaining a precise estimate of the tracked vehicle's velocity. Using this alignment method, we obtain velocity estimates with much higher accuracy than previous methods. Through pre-filtering, we achieve near-real-time performance, and we also present an online method for real-time use with accuracy close to that of the full method. We present a novel approach to quantitatively evaluating our velocity estimates: we track a parked car in a local reference frame in which it appears to be moving relative to the ego vehicle. We use this evaluation method to automatically evaluate our tracking performance on 466 separate tracked vehicles. Our method obtains a mean absolute velocity error of 0.27 m/s and an RMS error of 0.47 m/s on this test set. We also qualitatively evaluate our method by building colored 3D car models from moving vehicles. We have thus demonstrated that our method can be used for precision car tracking with applications to autonomous driving and behavior modeling.
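To make the pipeline concrete, the sketch below shows one way the two key steps could look: projecting sparse lidar points into a camera image to obtain a colored point cloud, and aligning two colored clouds with a color-weighted ICP variant. This is a minimal illustration under stated assumptions, not the authors' exact algorithm; the function names (colorize_points, color_icp) and the color_weight parameter are hypothetical, and the paper's own color-augmented search may differ from the nearest-neighbor ICP used here.

```python
import numpy as np
from scipy.spatial import cKDTree


def colorize_points(points_xyz, image, K, T_cam_from_lidar):
    """Project lidar points into the camera image and attach RGB colors.

    points_xyz: (N, 3) lidar points; image: (H, W, 3) uint8;
    K: (3, 3) camera intrinsics; T_cam_from_lidar: (4, 4) extrinsics.
    Returns points in the camera frame with normalized RGB colors.
    """
    pts_h = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    pts_cam = (T_cam_from_lidar @ pts_h.T).T[:, :3]
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]          # keep points in front of camera
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                     # perspective division
    h, w = image.shape[:2]
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    pts_cam, uv = pts_cam[valid], uv[valid].astype(int)
    colors = image[uv[:, 1], uv[:, 0]] / 255.0      # sample pixel colors
    return pts_cam, colors


def color_icp(src_xyz, src_rgb, tgt_xyz, tgt_rgb, color_weight=0.3, iters=30):
    """Align src to tgt with ICP whose correspondence search uses a joint
    geometry + color distance; returns a 4x4 rigid transform (src -> tgt).
    color_weight is an illustrative tuning parameter balancing meters vs. RGB."""
    T = np.eye(4)
    src = src_xyz.copy()
    tree = cKDTree(np.hstack([tgt_xyz, color_weight * tgt_rgb]))
    for _ in range(iters):
        # Nearest neighbors in the combined 6-D (XYZ + weighted RGB) space.
        _, idx = tree.query(np.hstack([src, color_weight * src_rgb]))
        matched = tgt_xyz[idx]
        # Closed-form rigid transform (Kabsch) on the 3-D coordinates only.
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                    # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = (R @ src.T).T + t
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T
    return T
```

Given the colored point clouds of a tracked vehicle from two consecutive frames, the translation component of the returned transform divided by the inter-frame time gives a velocity estimate in the same spirit as the alignment-based estimates described above.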
