Collision sensing by stereo vision and radar sensor fusion

To take advantage of both stereo cameras and radar, this paper proposes a fusion approach that accurately estimates the location, size, pose, and motion of a threat vehicle with respect to the host from observations obtained by both sensors. To this end, we first fit the contour of the threat vehicle from stereo depth information and find the closest point on that contour as seen by the vision sensor. The fused closest point is then obtained by fusing the radar observations with the vision closest point. Next, the fused contour is obtained by translating the fitted contour to the fused closest point. Finally, the fused contour is tracked using rigid-body constraints to estimate the location, size, pose, and motion of the threat vehicle. Experimental results on both synthetic data and real-world road-test data demonstrate the effectiveness of the proposed algorithm.
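The core fusion step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the vision and radar closest-point measurements are 2-D positions with known covariances and combines them by inverse-covariance weighting, then rigidly translates the fitted contour so that its closest point coincides with the fused point. The function names, the weighting scheme, and the covariance inputs are assumptions for illustration.

```python
import numpy as np

def fuse_closest_point(p_vision, cov_vision, p_radar, cov_radar):
    """Fuse two 2-D position estimates by inverse-covariance weighting.

    p_vision, p_radar : shape-(2,) position measurements
    cov_vision, cov_radar : 2x2 measurement covariances
    Returns the fused position and its covariance.
    """
    w_v = np.linalg.inv(cov_vision)
    w_r = np.linalg.inv(cov_radar)
    cov_fused = np.linalg.inv(w_v + w_r)
    p_fused = cov_fused @ (w_v @ p_vision + w_r @ p_radar)
    return p_fused, cov_fused

def translate_contour(contour, p_vision_closest, p_fused):
    """Rigidly shift the fitted contour so its closest point
    lands on the fused closest point."""
    return contour + (p_fused - p_vision_closest)
```

With equal measurement covariances the fused point is simply the midpoint of the two observations; in practice the radar typically constrains range more tightly while vision constrains lateral position, which the covariances would encode.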
