Tracking of deformable contours by synthesis and match

This paper addresses the problem of tracking deformable contours by synthesis and match. We propose a framework that encodes specific information about the shape, deformation, and motion of the target object. Using this information, the trackers synthesize the contours most likely to describe the object in the current frame and perform localization to select the best-matching templates. We present three trackers: the first imposes affine motion smoothness constraints, the second employs principal component analysis (PCA) to synthesize a codebook of contour templates, and the third combines these ideas to synthesize templates along several major modes of motion. The resulting trackers require only a few parameters to characterize the motion and are thus suitable for very low bit-rate visual communication. Preliminary applications to model-based coding have been attempted.
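The second tracker's idea of synthesizing a codebook of contour templates via principal component analysis can be sketched as follows. This is a hedged illustration, not the authors' implementation: the function names, the number of modes, and the coefficient sampling grid are all assumptions. Each training contour of K points is flattened into a 2K-vector, a few principal deformation modes are extracted, and candidate templates are generated by varying one mode coefficient at a time within a few standard deviations.

```python
import numpy as np

def fit_shape_model(contours, n_modes=3):
    """Fit a PCA shape model.

    contours: (n_samples, 2K) array, each row a flattened training contour.
    Returns the mean contour, the top principal modes, and the per-mode
    standard deviations of the training coefficients.
    """
    mean = contours.mean(axis=0)
    X = contours - mean
    # PCA via SVD of the centered data matrix
    _, s, vt = np.linalg.svd(X, full_matrices=False)
    modes = vt[:n_modes]                          # (n_modes, 2K) deformation modes
    stddev = s[:n_modes] / np.sqrt(len(contours)) # spread along each mode
    return mean, modes, stddev

def synthesize_templates(mean, modes, stddev, steps=(-2.0, 0.0, 2.0)):
    """Build a small codebook by sampling each mode coefficient on a grid.

    A tracker would then match each synthesized template against the
    current frame and keep the best-scoring one.
    """
    templates = []
    for i, sigma in enumerate(stddev):
        for c in steps:
            b = np.zeros(len(stddev))
            b[i] = c * sigma          # perturb only mode i
            templates.append(mean + b @ modes)
    return np.array(templates)
```

Because the synthesized template is fully described by the few mode coefficients `b`, only those coefficients need to be transmitted per frame, which is what makes the approach attractive for very low bit-rate coding.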
