Pair-activity classification by bi-trajectories analysis

In this paper, we address the pair-activity classification problem, which explores the relationship between two active objects based on their motion information. Our contributions are three-fold. First, we design a set of features, including the causality ratio and feedback ratio derived from the Granger Causality Test (GCT), for describing pair-activities encoded as trajectory pairs. These features, together with conventional velocity and position features, are inherently multi-modal and may differ greatly in scale and importance. To make full use of them, we then present a novel feature normalization procedure that learns weighting coefficients for these features by maximizing their discriminating power, measured by weighted correlation. Finally, we collected a pair-activity database of five categories, each containing about 170 instances. Extensive experiments on this database validate the effectiveness of the designed features for pair-activity representation, and demonstrate that the proposed feature normalization procedure substantially boosts pair-activity classification accuracy.
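To make the GCT-based features concrete, the sketch below computes the standard Granger F-statistic between two 1-D motion series (e.g., per-frame speeds of the two objects) and derives a causality ratio and a feedback ratio from the two directional statistics. This is a minimal illustration, not the paper's method: the ratio definitions and helper names (granger_f_stat, pair_activity_ratios) are assumptions introduced here, and only the F-test itself follows the standard Granger formulation.

import numpy as np

def granger_f_stat(x, y, lag=2):
    """F-statistic testing whether past values of y help predict x.

    Standard Granger test: compare a restricted AR model (x regressed on
    its own lags) against an unrestricted model that adds lags of y.
    """
    n = len(x)
    target = x[lag:]                                    # x[t] for t = lag..n-1
    ones = np.ones((n - lag, 1))
    own = np.column_stack([x[lag - k:n - k] for k in range(1, lag + 1)])
    cross = np.column_stack([y[lag - k:n - k] for k in range(1, lag + 1)])

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        return float(resid @ resid)

    rss_r = rss(np.hstack([ones, own]))                 # restricted model
    rss_u = rss(np.hstack([ones, own, cross]))          # unrestricted model
    df_num = lag                                        # added y-lag regressors
    df_den = (n - lag) - (2 * lag + 1)                  # residual degrees of freedom
    return ((rss_r - rss_u) / df_num) / (rss_u / df_den)

def pair_activity_ratios(traj_a, traj_b, lag=2):
    """Illustrative (assumed) causality/feedback ratios for a trajectory pair."""
    f_ab = granger_f_stat(traj_b, traj_a, lag)  # A's past explaining B
    f_ba = granger_f_stat(traj_a, traj_b, lag)  # B's past explaining A
    eps = 1e-12                                 # guard against division by zero
    causality_ratio = max(f_ab, f_ba) / (f_ab + f_ba + eps)
    feedback_ratio = min(f_ab, f_ba) / (max(f_ab, f_ba) + eps)
    return causality_ratio, feedback_ratio

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    leader = np.cumsum(rng.normal(size=400))    # leading object's speed profile
    follower = np.empty(400)
    follower[:3] = leader[:3]
    follower[3:] = leader[:-3]                  # trails the leader by 3 frames
    follower += 0.1 * rng.normal(size=400)
    print(pair_activity_ratios(leader, follower, lag=4))

In a following scenario the leading series should dominate, pushing the causality ratio toward 1 and the feedback ratio toward 0, whereas a more symmetric interaction such as walking side by side should yield comparable F-statistics in both directions.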
