Robust multivariate L1 principal component analysis and dimensionality reduction

Further to our recent work on robust L1 PCA, we introduce a new robust PCA model based on the so-called multivariate Laplace distribution (referred to as the L1 distribution) proposed by Eltoft et al. [2006. On the multivariate Laplace distribution. IEEE Signal Process. Lett. 13(5), 300-303]. Owing to the heavy tails and strong component dependency of the multivariate L1 distribution, the proposed model is expected to be more robust against data outliers and better able to capture dependencies among components. Additionally, we demonstrate how a variational approximation scheme enables effective inference of the key parameters in the probabilistic multivariate L1-PCA model. In this way, tractable Bayesian inference is achieved via a variational EM-type algorithm.

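As a concrete illustration of the robust-subspace idea, the sketch below implements a simplified iteratively reweighted least-squares (IRLS) variant of robust PCA in Python. The per-sample weights play a role analogous to the latent scale variables that arise when the multivariate Laplace of Eltoft et al. is written as a Gaussian scale mixture, so samples with large reconstruction residuals are down-weighted. This is not the paper's variational Bayesian algorithm; the function name robust_pca_irls, the weighting rule, the stopping test, and the toy data are illustrative assumptions.

    # Simplified IRLS sketch of robust subspace fitting (illustrative only,
    # not the variational Bayesian L1-PCA algorithm of the paper).
    import numpy as np

    def robust_pca_irls(X, n_components=2, n_iter=50, eps=1e-6):
        """Robustly fit an n_components-dimensional subspace to X (n_samples x n_features).

        Each iteration solves a weighted PCA, then recomputes per-sample weights
        as the inverse residual norm, which approximately minimises the sum of
        residual norms (an L1-type criterion over samples).
        """
        n, d = X.shape
        w = np.ones(n)                        # per-sample weights
        mu = X.mean(axis=0)
        for _ in range(n_iter):
            # Weighted mean and weighted covariance eigen-decomposition.
            mu = (w[:, None] * X).sum(axis=0) / w.sum()
            Xc = X - mu
            C = (w[:, None] * Xc).T @ Xc / w.sum()
            evals, evecs = np.linalg.eigh(C)
            W = evecs[:, -n_components:]      # principal subspace basis (d x q)
            # Residual of each sample after projection onto the subspace.
            resid = Xc - (Xc @ W) @ W.T
            r = np.linalg.norm(resid, axis=1)
            w_new = 1.0 / np.maximum(r, eps)  # IRLS weights for an L1-type loss
            if np.allclose(w_new, w, rtol=1e-4):
                break
            w = w_new
        return W, mu

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Clean data near a 2-D subspace in 5-D, plus a few gross outliers.
        Z = rng.normal(size=(200, 2))
        A = rng.normal(size=(2, 5))
        X = Z @ A + 0.05 * rng.normal(size=(200, 5))
        X[:10] += 20 * rng.normal(size=(10, 5))   # outliers
        W, mu = robust_pca_irls(X, n_components=2)
        print("Recovered basis shape:", W.shape)

Because each weight is the inverse of a sample's residual norm, the loop approximately minimises the sum of residual norms rather than the squared-error criterion of classical PCA, which is what confers robustness to gross outliers.
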
[1] Michel Verleysen, et al. Robust probabilistic projections, 2006, ICML.

[2] Chris H. Q. Ding, et al. R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization, 2006, ICML.

[3] R. Tibshirani. Regression Shrinkage and Selection via the Lasso, 1996.

[4] F. Ruymgaart. A robust principal component analysis, 1981.

[5] P. Sabatier. An L1-norm PCA and a Heuristic Approach, 1996.

[6] Neil D. Lawrence, et al. Variational inference for Student-t models: Robust Bayesian interpolation and generalised component analysis, 2005, Neurocomputing.

[7] Frank Dellaert, et al. Robust Generative Subspace Modeling: The Subspace t Distribution, 2004.

[8] Geoffrey J. McLachlan, et al. Robust mixture modelling using the t distribution, 2000, Statistics and Computing.

[9] Donald B. Rubin. Iteratively Reweighted Least Squares, 2006.

[10] Michael J. Black, et al. Robust Principal Component Analysis for Computer Vision, 2001, ICCV.

[11] Takeo Kanade, et al. Robust L1 norm factorization in the presence of outliers and missing data by alternative convex programming, 2005, CVPR.

[12] Michael J. Black, et al. Robust principal component analysis for computer vision, 2001, Proceedings of the Eighth IEEE International Conference on Computer Vision (ICCV 2001).

[13] Michael E. Tipping, et al. Probabilistic Principal Component Analysis, 1999.

[14] A. Ng. Feature selection, L1 vs. L2 regularization, and rotational invariance, 2004, ICML.

[15] Te-Won Lee, et al. On the multivariate Laplace distribution, 2006, IEEE Signal Processing Letters.

[16] Junbin Gao, et al. Twin Kernel Embedding, 2008, IEEE Transactions on Pattern Analysis and Machine Intelligence.

[17] Cédric Archambeau. Probabilistic models in noisy environments and their application to a visual prosthesis for the blind, 2005.

[18] Otto Opitz, et al. Ordinal and Symbolic Data Analysis, 1996.

[19] Massimiliano Pontil, et al. On the Noise Model of Support Vector Machines Regression, 2000, ALT.

[20] Vojtech Franc, et al. Robust subspace mixture models using t-distributions, 2003, BMVC.

[21] Michael J. Black, et al. A Framework for Robust Subspace Learning, 2003, International Journal of Computer Vision.

[22] Yi Ma, et al. Robust principal component analysis?, 2009, JACM.

[23] Junbin Gao, et al. Robust L1 Principal Component Analysis and Its Bayesian Variational Inference, 2008, Neural Computation.