Dimensionality Reduction with Sparse Locality for Principal Component Analysis

Various dimensionality reduction (DR) schemes have been developed to project high-dimensional data into a low-dimensional representation. Existing schemes usually preserve either the global structure or the local structure of the original data, but not both. To resolve this issue, a scheme called sparse locality for principal component analysis (SLPCA) is proposed. To balance complexity against efficiency, a robust L2,p-norm-based principal component analysis (R2P-PCA) is introduced for global DR, while sparse representation-based locality preserving projection (SR-LPP) is used for local DR. Sparse representation is also employed to construct the weight matrix of the samples; because this construction requires no neighborhood parameter, the resulting intrinsic graph is more robust to noise. In addition, the projection matrix and the sparse similarity matrix are learned simultaneously. Experimental results demonstrate that the proposed scheme consistently outperforms existing schemes in terms of clustering accuracy and data reconstruction error.
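
To make the local branch concrete, below is a minimal Python sketch of sparse-representation-based graph construction followed by a locality preserving projection on the induced graph: each sample is coded as a sparse linear combination of the remaining samples, the absolute coefficients form the weight matrix, and the LPP projection is obtained from a generalized eigenproblem on the graph Laplacian. The Lasso penalty `alpha`, the small ridge term, and the sample sizes are illustrative assumptions, and the two steps are run sequentially here, whereas SLPCA learns the projection and the sparse similarity matrix jointly and adds the global R2P-PCA term.

```python
import numpy as np
from sklearn.linear_model import Lasso
from scipy.linalg import eigh

def sparse_similarity(X, alpha=0.1):
    """Weight matrix via sparse coding: each sample is represented as a
    sparse combination of the other samples (L1 penalty = alpha)."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.arange(n) != i
        lasso = Lasso(alpha=alpha, max_iter=5000)
        # Solve x_i ~ sum_j c_j x_j over the remaining samples, c sparse.
        lasso.fit(X[idx].T, X[i])
        W[i, idx] = np.abs(lasso.coef_)
    return (W + W.T) / 2  # symmetrize the graph

def lpp_projection(X, W, d):
    """Locality preserving projection onto d dimensions using graph W."""
    D = np.diag(W.sum(axis=1))
    L = D - W                                     # graph Laplacian
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])   # ridge for stability
    # Smallest generalized eigenvectors preserve local neighborhoods.
    _, vecs = eigh(A, B)
    return vecs[:, :d]

# Usage: embed 100 noisy 20-D samples into 2 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
W = sparse_similarity(X)
P = lpp_projection(X, W, d=2)
Y = X @ P
```

Note the design motivation this sketch illustrates: unlike a classical k-nearest-neighbor graph, the sparse-coding step selects each sample's neighbors automatically through the L1 regularizer, which is why the intrinsic graph construction avoids a hand-tuned neighborhood-size parameter.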
