Graph-dual Laplacian principal component analysis

Principal component analysis (PCA) is the most widely used method for linear dimensionality reduction because of its effectiveness in uncovering the low-dimensional global geometric structure embedded in data. To preserve the intrinsic local geometric structure of the data, graph-Laplacian PCA (gLPCA) incorporates Laplacian embedding into the PCA framework to learn local similarities between data points, which leads to significant performance improvements in clustering and classification. Recent work has shown that not only does high-dimensional data reside on a low-dimensional manifold in data space, but the features also lie on a manifold in feature space. However, both PCA and gLPCA overlook the local geometric information contained in the feature space. By exploiting the duality between the data manifold and the feature manifold, graph-dual Laplacian PCA (gDLPCA) is proposed, which incorporates both data-graph and feature-graph regularization into the PCA framework to exploit the local geometric structures of the data manifold and the feature manifold simultaneously. Experimental results on four benchmark data sets confirm its effectiveness and show that gDLPCA outperforms gLPCA on classification and clustering tasks. A minimal illustrative sketch of the dual-graph idea is given below.
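
The following is a minimal sketch, not the authors' implementation, of how a dual-graph Laplacian regularized PCA objective could be set up: a kNN-graph Laplacian is built over samples (data graph) and another over features (feature graph), and a factorization X ≈ U Qᵀ is fit with both Laplacian penalties. The function and parameter names (knn_laplacian, gdlpca, alpha, beta) and the simplified unconstrained alternating least-squares updates via Sylvester equations are assumptions for illustration, not the exact gDLPCA algorithm.

```python
# Illustrative sketch only: dual-graph (data + feature) Laplacian regularized PCA.
# Assumes X is a (features x samples) matrix; objective (hypothetical form):
#   ||X - U Q^T||_F^2 + alpha * tr(Q^T L_data Q) + beta * tr(U^T L_feat U)
import numpy as np
from scipy.linalg import solve_sylvester
from sklearn.neighbors import kneighbors_graph


def knn_laplacian(points, k=5):
    """Unnormalized graph Laplacian L = D - W of a symmetrized kNN graph."""
    W = kneighbors_graph(points, n_neighbors=k, mode="connectivity").toarray()
    W = np.maximum(W, W.T)                    # symmetrize the adjacency matrix
    return np.diag(W.sum(axis=1)) - W


def gdlpca(X, n_components=10, alpha=1.0, beta=1.0, n_iter=50, k=5):
    """Alternating minimization of the dual-graph regularized objective above."""
    L_data = knn_laplacian(X.T, k=k)          # graph over samples (columns of X)
    L_feat = knn_laplacian(X, k=k)            # graph over features (rows of X)

    d, n = X.shape
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((n, n_components))    # low-dimensional embedding
    for _ in range(n_iter):
        # U-step solves:  beta * L_feat U + U (Q^T Q) = X Q   (Sylvester equation)
        U = solve_sylvester(beta * L_feat, Q.T @ Q, X @ Q)
        # Q-step solves:  alpha * L_data Q + Q (U^T U) = X^T U
        Q = solve_sylvester(alpha * L_data, U.T @ U, X.T @ U)
    return U, Q


# Example: 200 samples with 50 features, embedded into 5 dimensions.
X = np.random.default_rng(1).standard_normal((50, 200))
U, Q = gdlpca(X, n_components=5)
print(U.shape, Q.shape)   # (50, 5) (200, 5)
```

Dropping the orthonormality constraint on Q keeps each subproblem a linear (Sylvester) system; the published closed-form and constrained variants would differ in detail.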
