DPCA: Dimensionality Reduction for Discriminative Analytics of Multiple Large-Scale Datasets

Principal component analysis (PCA) has well-documented merits for feature extraction and dimensionality reduction. PCA deals with a single dataset at a time, however, and it is challenged when it comes to analyzing multiple datasets jointly. In certain setups, one wishes to extract the most significant information of one dataset relative to other datasets. Specifically, the interest may be in identifying or extracting features that are specific to a single target dataset but absent from the others. This paper presents a novel approach for such so-termed discriminative data analysis, and establishes its optimality in the least-squares sense under suitable assumptions. The criterion reveals linear combinations of variables by maximizing the ratio of the variance of the target data to that of the remaining datasets. The resulting generalized eigenvalue problem is solved by performing a singular value decomposition (SVD) just once. Numerical tests using synthetic and real datasets showcase the merits of the proposed approach relative to competing alternatives.
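The criterion described above can be sketched in a few lines of linear algebra. The following is a minimal illustrative implementation, not the paper's exact algorithm: it maximizes the ratio of target-data variance to background-data variance by solving the generalized eigenvalue problem Cx u = λ Cy u, here via whitening with respect to the background covariance. The function name and interface are assumptions for illustration.

```python
import numpy as np

def dpca(X, Y, k=2):
    """Sketch of discriminative PCA (illustrative, not the paper's code).

    Finds k directions u maximizing (u' Cx u) / (u' Cy u), where Cx is
    the covariance of the target data X (n_x, d) and Cy that of the
    background data Y (n_y, d), by solving Cx u = lambda Cy u.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Cx = Xc.T @ Xc / len(X)
    Cy = Yc.T @ Yc / len(Y)

    # Whiten with respect to Cy: form Cy^{-1/2} from one symmetric
    # eigendecomposition, then run ordinary PCA in the whitened space.
    s, V = np.linalg.eigh(Cy)
    W = V @ np.diag(1.0 / np.sqrt(np.maximum(s, 1e-12))) @ V.T  # Cy^{-1/2}
    M = W @ Cx @ W  # symmetric; its eigenvectors give the solution

    vals, vecs = np.linalg.eigh(M)           # ascending eigenvalues
    U = W @ vecs[:, ::-1][:, :k]             # top-k generalized eigenvectors
    return U
```

For example, if the background varies strongly along one coordinate and the target varies along that coordinate plus an additional one, the leading discriminative direction aligns with the target-specific coordinate, which plain PCA on the target alone would rank second.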
