Visual 3D target tracking for autonomous vehicles

In this paper, an algorithm is proposed to identify and track moving objects for autonomous vehicle target-following applications. The problem is difficult because both the targets and the cameras are moving. Optical flow fields, color features, and stereo-pair disparity are used as visual cues, while the vehicle's motion sensors are used to estimate the camera motion. The paper then proposes a data fusion algorithm that integrates the information obtained from the different visual cues with the camera motion sensor data. The fusion algorithm estimates the speed and relative position of the target of interest in 3D world coordinates for the vehicle to track. A detailed description of the three-dimensional (3D) target tracking algorithm, which uses an extended Kalman filter, is presented. Experimental results on different image sequences demonstrate the performance of the proposed scheme.
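The extended Kalman filter described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes a constant-velocity motion model for the target state [px, py, pz, vx, vy, vz] and treats the fused 3D position (from optical flow, color, and stereo disparity) as a direct, already-linearized measurement; the noise levels and time step are made-up values for demonstration. (With a linear measurement model the EKF reduces to a standard Kalman filter; the paper's nonlinear camera-projection measurement would be linearized via its Jacobian in the update step.)

```python
import numpy as np

# Hypothetical sketch of one predict/update cycle of the 3D tracking
# filter. State x = [px, py, pz, vx, vy, vz]; the measurement z is a
# fused 3D target position. All matrices below are illustrative.

def ekf_predict(x, P, F, Q):
    """Propagate state and covariance one time step forward."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x_pred, P_pred, z, H, R):
    """Correct the prediction with a 3D position measurement z."""
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x_pred + K @ y
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P

dt = 0.1                                      # frame interval (assumed)
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)                    # position += velocity * dt
Q = 0.01 * np.eye(6)                          # process noise (assumed)
H = np.hstack([np.eye(3), np.zeros((3, 3))])  # observe position only
R = 0.05 * np.eye(3)                          # measurement noise (assumed)

x = np.zeros(6)                               # initial state
P = np.eye(6)                                 # initial covariance
for z in [np.array([1.0, 0.0, 5.0]), np.array([1.1, 0.0, 5.0])]:
    x, P = ekf_predict(x, P, F, Q)
    x, P = ekf_update(x, P, z, H, R)
```

After a few measurements the filtered position converges toward the measured target position, and the velocity components are inferred from the position history through the cross-covariance terms.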
