Background Compensation for Pan-Tilt-Zoom Cameras Using 1-D Feature Matching and Outlier Rejection

This letter proposes an efficient and robust background compensation method for pan-tilt-zoom cameras. The proposed method approximates the relation between consecutive images by a three-parameter similarity transformation that is separable along the horizontal and vertical axes, and it extracts and matches 1-D features, namely the local minima and maxima of the intensity projection profile along each axis. The resulting correspondences are used to estimate the transformation parameters through an outlier rejection approach. Experimental results show that the proposed method is more robust to blurring and to the proportion of moving objects, while dramatically reducing computational cost compared with previous methods.
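
The following is a minimal sketch of the pipeline outlined above, under stated assumptions: the function names (projection_profile, local_extrema, match_features, fit_1d_similarity), the nearest-neighbour matcher, and the RANSAC-style consensus loop are illustrative stand-ins, since the abstract does not spell out the paper's actual matching rule or outlier-rejection scheme, and each axis is fitted independently here, whereas the three-parameter model shares a single zoom scale between the two axes.

    import numpy as np


    def projection_profile(image, axis):
        """Mean intensity projected along one image axis (axis=0 gives one value per column)."""
        return image.mean(axis=axis)


    def local_extrema(profile):
        """Positions of local minima and maxima of a 1-D profile (the 1-D features)."""
        sign_change = np.diff(np.sign(np.diff(profile)))
        maxima = np.where(sign_change < 0)[0] + 1
        minima = np.where(sign_change > 0)[0] + 1
        return np.sort(np.concatenate([minima, maxima]))


    def match_features(pos_a, pos_b, max_shift=20):
        """Pair each feature of frame A with the nearest feature of frame B (naive matcher)."""
        pairs = []
        for p in pos_a:
            j = np.argmin(np.abs(pos_b - p))
            if abs(pos_b[j] - p) <= max_shift:
                pairs.append((p, pos_b[j]))
        return np.array(pairs, dtype=float)


    def fit_1d_similarity(pairs, iters=200, tol=1.0, seed=0):
        """Robustly fit x' = s*x + t to matched 1-D positions with a RANSAC-style consensus."""
        rng = np.random.default_rng(seed)
        best_inliers, best_model = 0, (1.0, 0.0)
        for _ in range(iters):
            i, j = rng.choice(len(pairs), size=2, replace=False)
            (x1, y1), (x2, y2) = pairs[i], pairs[j]
            if x1 == x2:
                continue
            s = (y2 - y1) / (x2 - x1)
            t = y1 - s * x1
            residuals = np.abs(s * pairs[:, 0] + t - pairs[:, 1])
            inliers = np.count_nonzero(residuals < tol)
            if inliers > best_inliers:
                best_inliers, best_model = inliers, (s, t)
        return best_model  # (scale, translation) along this axis


    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        base = np.convolve(rng.random(320), np.ones(15) / 15, mode="same")  # smooth 1-D signal
        prev = np.tile(base, (240, 1))           # stand-in for the previous grayscale frame
        curr = np.roll(prev, 3, axis=1)          # simulate a 3-pixel horizontal pan
        feats_prev = local_extrema(projection_profile(prev, axis=0))
        feats_curr = local_extrema(projection_profile(curr, axis=0))
        pairs = match_features(feats_prev, feats_curr)
        scale, shift = fit_1d_similarity(pairs)
        print(f"horizontal scale ~ {scale:.3f}, shift ~ {shift:.1f} px")

Working on 1-D projection profiles rather than 2-D feature maps is what keeps the per-frame cost low, which is the efficiency argument the abstract makes; the outlier-rejection step then absorbs the correspondences corrupted by moving objects.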
