Robust principal component analysis via optimal mean by joint ℓ2,1 and Schatten p-norm minimization

Abstract Since principal component analysis (PCA) is sensitive to corrupted variables or observations, which degrades its performance and limits its applicability in real scenarios, several convex robust PCA methods have been developed to enhance its robustness. However, most of them neglect the optimal mean calculation problem: they center the data with the mean computed under the ℓ2-norm, which is inconsistent with the ℓ1-norm-based objective functions used in the subsequent steps. In this paper, we propose a novel robust PCA method that can detect and remove outliers, exactly recover a low-rank matrix, and compute the optimal mean. Specifically, we formulate an optimization model consisting of an ℓ2,1-norm loss function and a Schatten p-norm regularization term. The ℓ2,1-norm loss serves to detect and remove outliers, while the Schatten p-norm suppresses the singular values of the reconstructed data more strongly for smaller p (0 < p ≤ 1), yielding a better low-rank recovery.
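To make the two building blocks of the objective concrete, here is a minimal numpy sketch of the ℓ2,1 norm, the Schatten p-norm, and an illustrative form of the resulting objective with a jointly optimized mean. The function names, the regularization weight `lam`, and the variable names `X`, `L`, `b` are our own illustrative choices, not the paper's notation.

```python
import numpy as np

def l21_norm(X):
    # ℓ2,1 norm: sum of the ℓ2 norms of the columns (one term per sample),
    # so an outlying sample contributes its magnitude, not its square.
    return np.sum(np.linalg.norm(X, axis=0))

def schatten_p_norm(X, p):
    # Schatten p-norm: the ℓp norm of the singular values. For 0 < p < 1
    # it approximates the rank more tightly than the nuclear norm (p = 1).
    s = np.linalg.svd(X, compute_uv=False)
    return np.sum(s ** p) ** (1.0 / p)

def objective(X, L, b, p=0.5, lam=1.0):
    # Illustrative objective: ℓ2,1 reconstruction loss on the data minus
    # the low-rank term and the (optimized) mean b, plus a Schatten
    # p-norm regularizer raised to the p-th power, as is conventional.
    residual = X - L - b[:, None]
    return l21_norm(residual) + lam * schatten_p_norm(L, p) ** p
```

In an actual solver, `L` and `b` would be optimized jointly (e.g. with an ADMM-style scheme), so the mean is estimated under the same robust loss rather than as a plain ℓ2 average.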
