Orthogonal self-guided similarity preserving projections

In this paper, we propose a novel unsupervised dimensionality reduction (DR) method called orthogonal self-guided similarity preserving projections (OSSPP), which seamlessly integrates adjacency graph learning and DR into a single step. Specifically, OSSPP projects the data into a low-dimensional subspace while simultaneously performing similarity-preserving learning via a regularization term in which the reconstruction coefficients of the projected data encode the similarity structure. An interesting finding is that the problem of determining the reconstruction coefficients can be converted into a weighted non-negative sparse coding problem without any explicit sparsity constraint; the projections obtained by OSSPP therefore carry natural discriminating information. Experimental results demonstrate that OSSPP outperforms state-of-the-art DR methods.
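The abstract describes an alternating scheme: learn an orthogonal projection and, at the same time, non-negative reconstruction coefficients for the projected data that encode the similarity structure. The sketch below illustrates one plausible instance of such a scheme; the update rules, the regularization weight `lam`, and the function name `osspp_sketch` are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def osspp_sketch(X, k, n_iter=20, lam=0.1):
    """Hedged sketch of an OSSPP-style alternating scheme (details assumed):
    alternate between non-negative reconstruction coefficients S for the
    projected data and an orthogonal projection W.

    X: data matrix of shape (d, n); k: target dimensionality.
    """
    d, n = X.shape
    rng = np.random.default_rng(0)
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))   # random orthogonal init
    S = np.abs(rng.standard_normal((n, n))) * 0.01     # non-negative coefficients
    np.fill_diagonal(S, 0.0)                           # no self-reconstruction
    for _ in range(n_iter):
        Y = W.T @ X                                    # projected data (k x n)
        # Multiplicative update keeps S non-negative while decreasing
        # ||Y - Y S||_F^2 + lam * sum(S); split the Gram matrix into its
        # positive and negative parts (semi-NMF-style step).
        G = Y.T @ Y
        Gp, Gm = np.maximum(G, 0.0), np.maximum(-G, 0.0)
        S *= (Gp + Gm @ S) / (Gm + Gp @ S + lam + 1e-12)
        np.fill_diagonal(S, 0.0)
        # With S fixed, the reconstruction residual is minimized by the
        # eigenvectors of X (I - S)(I - S)^T X^T with smallest eigenvalues.
        M = X @ (np.eye(n) - S)
        _, vecs = np.linalg.eigh(M @ M.T)              # ascending eigenvalues
        W = vecs[:, :k]
    return W, S
```

Because `W` is taken from the eigenvectors of a symmetric matrix, its columns stay orthonormal at every iteration, matching the "orthogonal" constraint in the method's name; the multiplicative update preserves the non-negativity of `S` emphasized in the abstract.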
