Efficient tracking with the Bounded Hough Transform

The Bounded Hough Transform is introduced to track objects in a sequence of sparse range images. The method is a variation of the Generalized Hough Transform that exploits the coherence across image frames resulting from the relationship between known bounds on the object's velocity and the sensor frame rate. It is extremely efficient, running in O(N) time for N range data points, and effectively trades localization precision for runtime efficiency. The method has been implemented and tested on a variety of objects, including freeform surfaces, using both simulated and real data from Lidar and stereovision sensors. The motion bounds allow the inter-frame transformation space to be reduced to a small, tractable set of only 729 possible states. In a variation, the rotational subspace is projected onto the translational subspace, further reducing the transformation space to only 54 states. Experimental results confirm that the technique works well with very sparse data, possibly comprising only tens of points per frame, and that it is robust to measurement error and outliers.
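
To make the idea concrete, the sketch below illustrates the kind of bounded voting the abstract describes: each of the 3^6 = 729 candidate inter-frame transforms (three bounded offsets per degree of freedom) is scored by how many sparse range points it maps onto the model surface, so the per-frame cost stays linear in the number of points. This is a simplified Python illustration, not the authors' implementation; the voxel-grid model representation, the small-angle Euler parameterization, and all function and parameter names (bounded_hough_track, trans_step, rot_step, and so on) are assumptions introduced here for illustration only.

```python
# Minimal, hypothetical sketch of Bounded-Hough-style pose voting.
# Assumptions: the model is a boolean voxel occupancy grid, and the
# inter-frame motion is discretized to {-step, 0, +step} per DOF,
# giving 3**6 = 729 candidate transforms.

import itertools
import numpy as np


def rotation_matrix(rx, ry, rz):
    """XYZ Euler rotation (radians); angles are small under the motion bounds."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx


def bounded_hough_track(points, model_occupancy, voxel_size, origin,
                        trans_step, rot_step):
    """Score all 729 bounded inter-frame transforms and return the best one.

    points            : (N, 3) sparse range points, expressed in the pose of
                        the previous frame
    model_occupancy   : boolean voxel grid marking the model surface
    voxel_size, origin: geometry of the grid used to quantize transformed points
    trans_step, rot_step: per-frame motion bounds (one discretization step)
    """
    offsets = (-1.0, 0.0, 1.0)
    best_votes, best_pose = -1, None
    for drx, dry, drz, dtx, dty, dtz in itertools.product(offsets, repeat=6):
        R = rotation_matrix(drx * rot_step, dry * rot_step, drz * rot_step)
        t = np.array([dtx, dty, dtz]) * trans_step
        transformed = points @ R.T + t
        # Each range point casts a vote if it lands in an occupied model voxel.
        idx = np.floor((transformed - origin) / voxel_size).astype(int)
        inside = np.all((idx >= 0) & (idx < np.array(model_occupancy.shape)),
                        axis=1)
        votes = int(model_occupancy[tuple(idx[inside].T)].sum())
        if votes > best_votes:
            best_votes, best_pose = votes, (R, t)
    return best_pose, best_votes
```

Because the candidate set is a fixed constant (729 here, or 54 in the projected variant), the cost of the voting loop grows only with the number of range points, which is the trade of localization precision for runtime efficiency that the abstract refers to.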
