Tracking People and Recognizing Their Activities

We present a system for automatically tracking people and recognizing their activities. Our basic approach to people tracking is to build an appearance model for the person in the video. The video illustrates our method, which uses a stylized-pose detector: the system builds a model of limb appearance from these sparse stylized detections, then reprocesses the video, using the learned appearance models to find people in unrestricted configurations. The resulting tracks can be used to recover 3D configurations and activity labels, assuming a motion-capture library in which the 3D poses have been labeled offline with activity descriptions.
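The following is a minimal sketch, not the authors' implementation, of the two-pass pipeline described above: detect a stylized pose sparsely, learn per-limb appearance models from those detections, re-track every frame with the learned models, and label activities by nearest-neighbour lookup in an annotated motion-capture library. The histogram appearance model, the fixed candidate windows, and the synthetic data are simplifying assumptions made for illustration only.

```python
import numpy as np

def detect_stylized_poses(frames, stride=30):
    """Pass 1: a high-precision, low-recall detector for an easy, stylized pose
    (e.g. lateral walking). Here it simply fires every `stride` frames and cuts
    fixed limb windows; the real detector is learned from data."""
    detections = []
    for t in range(0, len(frames), stride):
        frame = frames[t]
        limbs = {"torso": frame[20:60, 30:50], "arm": frame[25:45, 50:58]}
        detections.append((t, limbs))
    return detections

def build_appearance_models(detections, bins=16):
    """Learn one appearance model per limb (here: an intensity histogram)
    from the sparse stylized detections."""
    models = {}
    for _, limbs in detections:
        for name, patch in limbs.items():
            hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0), density=True)
            models.setdefault(name, []).append(hist)
    return {name: np.mean(hists, axis=0) for name, hists in models.items()}

def track_all_frames(frames, models, bins=16):
    """Pass 2: reprocess every frame with the learned models. As a stand-in for
    a full search over body configurations, score a few candidate windows per
    limb and keep the one whose histogram best matches the model."""
    candidates = {"torso": [(20, 30), (20, 40), (30, 30)],
                  "arm":   [(25, 50), (25, 10), (35, 50)]}
    sizes = {"torso": (40, 20), "arm": (20, 8)}
    tracks = []
    for frame in frames:
        pose = {}
        for name, model in models.items():
            h, w = sizes[name]
            best, best_d = None, np.inf
            for (r, c) in candidates[name]:
                patch = frame[r:r + h, c:c + w]
                hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0), density=True)
                d = np.sum((hist - model) ** 2)
                if d < best_d:
                    best, best_d = (r, c), d
            pose[name] = best
        tracks.append(pose)
    return tracks

def label_activities(tracks, mocap_library):
    """Lift each tracked pose to an activity label by nearest-neighbour lookup
    in a motion-capture library whose poses carry offline activity annotations."""
    labels = []
    for pose in tracks:
        feat = np.array([v for limb in sorted(pose) for v in pose[limb]], float)
        dists = [np.linalg.norm(feat - entry["feature"]) for entry in mocap_library]
        labels.append(mocap_library[int(np.argmin(dists))]["activity"])
    return labels

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = rng.random((90, 80, 80))  # synthetic stand-in for a video clip
    models = build_appearance_models(detect_stylized_poses(frames))
    tracks = track_all_frames(frames, models)
    library = [{"feature": rng.random(4), "activity": a} for a in ("walk", "run", "wave")]
    print(label_activities(tracks, library)[:5])
```

The key design point carried over from the abstract is the two-pass structure: the stylized-pose detector is allowed to miss most frames as long as its detections are reliable, because the appearance models it yields make the second, unrestricted pass much easier.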
