Correlation filter-based self-paced object tracking

Object tracking is an important capability for robots that interact with humans and the environment, enabling them to locate and manipulate objects. A central challenge in object tracking is selecting samples from which to learn a robust and efficient appearance model: the learning scheme determines both the strategy and the frequency of model updates, and these details strongly affect tracking results. In this paper, we propose an object tracking approach that formulates a new objective function integrating the self-paced learning paradigm into tracking, so that reliable samples are automatically selected for model learning. Sample weights and model parameters are learned jointly by minimizing this single objective function within the kernelized correlation filter framework. Moreover, we propose a real-valued, error-tolerant self-paced function with a constraint vector that combines prior knowledge, i.e., the characteristics of object tracking, with information learned during tracking. We demonstrate the robustness and efficiency of our approach on the OTB-2013 object tracking benchmark.
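The core idea of self-paced sample selection for appearance-model updating can be illustrated with a small sketch. Note this is a simplified illustration, not the paper's method: it uses the classic *hard* (binary) self-paced weight with a fixed age parameter `lam`, whereas the paper proposes a real-valued, error-tolerant self-paced function jointly optimized with the correlation filter. The linear-interpolation model update and all numeric values below are illustrative assumptions.

```python
import numpy as np

def self_paced_weight(loss, lam):
    """Hard self-paced weight: admit a sample only if its loss is below lam.
    (The paper uses a real-valued, error-tolerant variant instead.)"""
    return 1.0 if loss < lam else 0.0

def update_model(model, sample, weight, lr=0.02):
    """Linear-interpolation model update, scaled by the self-paced weight.
    A weight of 0 leaves the appearance model untouched."""
    return (1.0 - lr * weight) * model + lr * weight * sample

rng = np.random.default_rng(0)

# Initialize the appearance model from the first (annotated) frame,
# as trackers typically do; features are stand-in 4-D vectors here.
model = np.ones(4) + 0.1 * rng.standard_normal(4)
lam = 1.0          # self-paced age parameter (fixed here for simplicity)
weights = []

for t in range(50):
    sample = np.ones(4) + 0.1 * rng.standard_normal(4)
    if t % 10 == 5:
        sample += 5.0              # corrupted frame, e.g., occlusion
    loss = np.mean((sample - model) ** 2)
    w = self_paced_weight(loss, lam)
    weights.append(w)
    model = update_model(model, sample, w)
```

Clean frames produce small losses and receive weight 1, so they refine the model; corrupted frames produce large losses, receive weight 0, and are excluded from the update, which is the drift-resistance behavior self-paced selection provides.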
