Transductive Classification by Robust Linear Neighborhood Propagation

We propose an enhanced label prediction method termed Robust Linear Neighborhood Propagation (R-LNP). To encode the neighborhood reconstruction error more accurately, we characterize the manifold smoothing term with the L2,1-norm, which has been shown to be robust to noise; the L2,1-norm also enforces row sparsity on the reconstruction error, i.e., the entries of some rows shrink to zero. In addition, to enhance robustness when modeling the difference between the initial and predicted labels, we also regularize the label fitting term with a weighted L2,1-norm, so the resulting measure is more accurate. Compared with several transductive label propagation models, our algorithm achieves state-of-the-art performance in extensive representation and classification experiments.
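As a minimal illustration (a sketch, not the authors' implementation), the L2,1-norm of a matrix is the sum of the Euclidean norms of its rows; penalizing it drives entire rows of the reconstruction-error matrix toward zero, which is the row-sparsity effect described above:

```python
import numpy as np

def l21_norm(M):
    """L2,1-norm: sum of the Euclidean norms of the rows of M."""
    return float(np.sum(np.linalg.norm(M, axis=1)))

# Toy example: a row-sparse error matrix. Zero rows contribute
# nothing, so minimizing the L2,1-norm favors zeroing whole rows
# rather than spreading small errors across every entry.
E = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [0.0, 0.0]])
print(l21_norm(E))  # 5.0 (only the first row contributes)
```

Unlike the squared Frobenius norm, this penalty is not differentiable at zero rows, which is what allows exact row sparsity in the optimum.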
