Tracking in uncalibrated cameras with overlapping field of view

To track people successfully across multiple cameras, one must establish correspondence between the objects captured in each camera. We present a system for tracking people in multiple uncalibrated cameras. The system automatically discovers spatial relationships between the camera fields of view and uses this information to establish correspondence between different perspective views of the same person. We take the novel approach of finding the limits of the field of view (FOV) of each camera as visible in the other cameras. Using this information, when a person is seen in one camera, we can predict all the other cameras in which that person will also be visible. Moreover, we apply the FOV constraint to disambiguate between possible candidates for correspondence. Before such an analysis can be applied, tracking must be resolved in each individual camera. We perform single-camera tracking using background subtraction followed by region correspondence, which takes into account the velocities, sizes, and distances of bounding boxes obtained through connected-component labeling. We present results on sequences from the PETS 2001 dataset, which contain several people and vehicles simultaneously. The proposed approach is very fast compared with camera-calibration-based approaches.
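The visibility prediction described above can be sketched as a simple side-of-line test: the limit of camera j's field of view projects (roughly) to a line in camera i's image, and a person whose feet lie on the visible side of that line is predicted to also appear in camera j. The line coefficients, sign convention, and coordinates below are illustrative assumptions, not values from the paper.

```python
def visible_in_other_camera(foot_xy, fov_line):
    """Return True if a point lies on the visible side of an FOV line.

    foot_xy  -- (x, y) image coordinates of the person's feet in camera i
    fov_line -- (a, b, c) coefficients of camera j's FOV limit, i.e. the
                line a*x + b*y + c = 0 in camera i's image (assumed form)
    """
    x, y = foot_xy
    a, b, c = fov_line
    # Sign convention assumed here: positive side = inside camera j's view.
    return a * x + b * y + c > 0

def predict_cameras(foot_xy, fov_lines):
    """List every camera predicted to also see the person."""
    return [cam for cam, line in fov_lines.items()
            if visible_in_other_camera(foot_xy, line)]

# Example with two hypothetical FOV lines in camera 0's image:
fov_lines = {
    "cam1": (1.0, 0.0, -100.0),   # visible to cam1 when x > 100
    "cam2": (0.0, 1.0, -240.0),   # visible to cam2 when y > 240
}
print(predict_cameras((320, 300), fov_lines))
```

The same test supports the disambiguation step: when several people could correspond to a new sighting in camera j, only those standing on the visible side of camera j's FOV line in camera i are admissible candidates.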
