Deformable and Occluded Object Tracking via Graph Learning

Object deformation and occlusion are ubiquitous challenges in visual tracking. Although many efforts have been made to handle them, most existing tracking algorithms fail under large deformation and severe occlusion. In this paper, we propose a graph learning-based tracking framework to handle both challenges. For each consecutive frame pair, we construct a weighted graph whose nodes are the local parts of both frames. Our algorithm optimizes the graph similarity matrix until two disconnected subgraphs separate the foreground and background nodes. We then assign foreground/background labels to the current-frame nodes based on the learned graph and estimate the object bounding box from the predicted foreground parts under an optimization framework. Experimental results on the Deform-SOT dataset show that the proposed method achieves state-of-the-art performance.
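To make the pipeline concrete, the sketch below illustrates the general idea of labeling current-frame parts through a part graph whose connected components separate foreground from background. This is only a toy stand-in, not the paper's solver: the Gaussian-kernel affinities, k-nearest-neighbor sparsification, and majority-vote label transfer are all simplifying assumptions, whereas the paper learns the similarity matrix by optimization.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def label_parts(feats_prev, labels_prev, feats_curr, k=5):
    """Toy label propagation over a part graph (illustrative only).

    feats_prev : (m, d) appearance features of previous-frame parts
    labels_prev: (m,)   1 = foreground, 0 = background
    feats_curr : (n, d) appearance features of current-frame parts
    Returns (n,) predicted labels for the current-frame parts.
    """
    feats = np.vstack([feats_prev, feats_curr])
    m, n = len(feats_prev), len(feats_curr)

    # Pairwise squared distances -> Gaussian similarities (assumed kernel)
    d2 = ((feats[:, None] - feats[None]) ** 2).sum(-1)
    sigma = np.median(d2) + 1e-12
    w = np.exp(-d2 / sigma)
    np.fill_diagonal(w, 0.0)

    # Sparsify: keep each node's k strongest edges as a crude stand-in
    # for the learned similarity matrix.
    idx = np.argsort(-w, axis=1)[:, :k]
    a = np.zeros_like(w)
    rows = np.repeat(np.arange(len(w)), k)
    a[rows, idx.ravel()] = w[rows, idx.ravel()]
    a = np.maximum(a, a.T)  # symmetrize

    # Each connected component inherits the majority label of its
    # previous-frame members (foreground if more than half are foreground).
    ncomp, comp = connected_components(csr_matrix(a), directed=False)
    labels = np.zeros(n, dtype=int)
    for c in range(ncomp):
        prev_members = labels_prev[comp[:m] == c]
        if prev_members.size and prev_members.mean() > 0.5:
            labels[comp[m:] == c] = 1
    return labels
```

Once the current-frame foreground parts are labeled this way, a bounding box can be fitted around them; in the paper that final step is itself posed as an optimization, which the sketch omits.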
