Semi-supervised low-rank representation graph for pattern recognition

In this study, the authors propose a new semi-supervised low-rank representation graph for pattern recognition. A collection of samples is jointly coded by the recently developed low-rank representation (LRR), which better captures the global structure of the data and yields more robust subspace segmentation from corrupted samples. Using the computed LRR coefficients of both labelled and unlabelled samples as graph weights, a low-rank representation graph is constructed in a parameter-free manner within the semi-supervised learning framework. Experiments on benchmark databases evaluate the performance of the proposed method, and the results show that it outperforms other related semi-supervised graphs.
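To make the construction concrete, the following is a minimal sketch, not the authors' implementation: it uses the closed-form solution of the noise-free LRR problem (min ||Z||_* s.t. X = XZ, whose minimizer is V_r V_r^T from the skinny SVD of X) rather than the robust formulation solved iteratively for corrupted data, and it pairs the resulting graph with a standard normalized-graph label-propagation step rather than the paper's specific classifier. The function names, parameters and toy data below are assumptions introduced only for illustration.

```python
import numpy as np

def lrr_coefficients(X):
    """Closed-form LRR for the noise-free model min ||Z||_* s.t. X = XZ,
    i.e. Z* = V_r V_r^T with X = U_r S_r V_r^T the skinny SVD.
    (For corrupted data, LRR adds an error term and is solved iteratively,
    e.g. by inexact ALM; that robust variant is omitted in this sketch.)"""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # X: d x n, columns = samples
    r = np.sum(s > 1e-10 * s.max())                   # numerical rank
    Vr = Vt[:r, :].T                                  # n x r right singular vectors
    return Vr @ Vr.T                                  # n x n coefficient matrix

def lrr_graph(Z):
    """Symmetric, non-negative graph weights from the LRR coefficients."""
    W = 0.5 * (np.abs(Z) + np.abs(Z).T)
    np.fill_diagonal(W, 0.0)
    return W

def propagate_labels(W, y, labelled, alpha=0.99):
    """Graph-based label propagation: F = (I - alpha*S)^{-1} Y,
    with S = D^{-1/2} W D^{-1/2} the normalized adjacency."""
    n = W.shape[0]
    classes = np.unique(y[labelled])
    Y = np.zeros((n, classes.size))
    for j, c in enumerate(classes):
        Y[labelled & (y == c), j] = 1.0               # one-hot labels for labelled samples
    d = W.sum(axis=1)
    d[d == 0] = 1e-12
    S = W / np.sqrt(np.outer(d, d))
    F = np.linalg.solve(np.eye(n) - alpha * S, Y)
    return classes[F.argmax(axis=1)]

# Toy usage: two independent low-dimensional subspaces, a few labelled points.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(20, 2)) @ rng.normal(size=(2, 30)),
               rng.normal(size=(20, 2)) @ rng.normal(size=(2, 30))])
y = np.repeat([0, 1], 30)
labelled = np.zeros(60, dtype=bool)
labelled[[0, 1, 30, 31]] = True
Z = lrr_coefficients(X)
pred = propagate_labels(lrr_graph(Z), y, labelled)
print("accuracy on unlabelled samples:", (pred[~labelled] == y[~labelled]).mean())
```

For independent subspaces the closed-form LRR coefficients are block-diagonal, so the induced graph connects samples mainly within their own subspace, which is why the propagation step recovers the unlabelled labels without any graph parameter (such as a neighbourhood size or kernel width) having to be tuned.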
