Flexible robust principal component analysis

Error correction is an important problem in machine learning. However, existing methods focus only on data recovery and ignore compact data representation. In this paper, we propose a flexible robust principal component analysis (FRPCA) method in which two different matrices are used to perform error correction, and a compact representation of the data can be obtained from one of the matrices. Moreover, FRPCA selects the most relevant features to guarantee that the recovered data faithfully preserve the original data semantics. Learning is done by solving a nuclear-norm regularized minimization problem, which is convex and can be solved in polynomial time. Experiments were conducted on image sequences containing targets of interest in a variety of environments, e.g., offices and campuses. We also compare our method with existing methods in recovering face images from corruption. Experimental results show that the proposed method achieves better performance and is more practical than existing approaches.
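The nuclear-norm regularized program mentioned above generalizes the classic convex RPCA formulation, min ||L||_* + λ||S||_1 subject to X = L + S, which separates a corrupted data matrix into a low-rank part L and a sparse error part S. As a point of reference (not the FRPCA algorithm itself, whose two-matrix formulation is not reproduced here), the following is a minimal sketch of that baseline program solved with an inexact augmented Lagrangian method; the step sizes and the default λ = 1/√max(m, n) follow common practice and are assumptions, not values from this paper.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def shrink(M, tau):
    """Soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca(X, lam=None, n_iter=200, tol=1e-7):
    """Baseline convex RPCA: min ||L||_* + lam*||S||_1  s.t.  X = L + S,
    solved by an inexact augmented Lagrangian (ADMM-style) iteration."""
    m, n = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))   # standard default from the RPCA literature
    mu = 1.25 / np.linalg.norm(X, 2)     # initial penalty weight
    rho, mu_max = 1.5, mu * 1e7
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    Y = np.zeros_like(X)                 # Lagrange multiplier for X = L + S
    norm_x = np.linalg.norm(X, "fro")
    for _ in range(n_iter):
        L = svt(X - S + Y / mu, 1.0 / mu)        # low-rank update
        S = shrink(X - L + Y / mu, lam / mu)     # sparse-error update
        residual = X - L - S
        Y = Y + mu * residual                    # dual ascent
        mu = min(mu * rho, mu_max)
        if np.linalg.norm(residual, "fro") <= tol * norm_x:
            break
    return L, S
```

Each iteration costs one SVD, so the whole procedure runs in polynomial time, matching the convexity/tractability claim in the abstract.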
