Low-Rank Matrix Completion in the Presence of High Coherence

Prevalent matrix completion methods capture only the low-rank property, which merely constrains the data points to lie on some low-dimensional subspace, and generally ignore the extra structures (beyond low rank) that specify in more detail how the data points are distributed on that subspace. Whenever the data points are not uniformly distributed on the low-dimensional subspace, the row-coherence of the target matrix can be considerably high, and prevalent methods may accordingly fail even when the target matrix is fairly low-rank. To address this challenge, we propose a model termed low-rank factor decomposition (LRFD), which imposes the additional restriction that the data points be represented as linear, compressive combinations of the bases in a given dictionary. We show that LRFD can effectively mitigate the challenges of high row-coherence, provided that its dictionary is configured properly. Namely, it is mathematically proven that if the dictionary is well-conditioned and low-rank, then LRFD can weaken the dependence on the row-coherence; in particular, if the dictionary itself is low-rank, then the dependence on the row-coherence can be entirely removed. We then devise two practical algorithms for obtaining proper dictionaries in unsupervised environments: one uses existing matrix completion methods to construct the dictionary for LRFD, and the other learns a proper dictionary from the given data. Experiments on randomly generated matrices and motion datasets demonstrate the superior performance of the proposed algorithms.
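The dictionary-based recovery idea described above can be sketched as an optimization problem: given a dictionary A, find a low-complexity coefficient matrix Z such that AZ agrees with the target matrix M on the observed entries, then output AZ as the completion. The snippet below is a minimal illustrative sketch (not the paper's actual algorithm), using a standard proximal-gradient scheme with singular value thresholding on a nuclear-norm-regularized objective; all function names, the regularization weight `lam`, and the iteration count are assumptions made for illustration.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: the proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def lrfd_complete(M_obs, mask, A, lam=0.05, n_iter=1000):
    """Recover M ~= A @ Z from observed entries via proximal gradient on
    0.5 * ||P_Omega(A Z - M)||_F^2 + lam * ||Z||_* (an illustrative surrogate,
    not the exact formulation from the paper)."""
    n = M_obs.shape[1]
    Z = np.zeros((A.shape[1], n))
    step = 1.0 / (np.linalg.norm(A, 2) ** 2)  # 1 / ||A||_2^2, a Lipschitz bound
    for _ in range(n_iter):
        R = mask * (A @ Z - M_obs)            # residual on observed entries only
        Z = svt(Z - step * (A.T @ R), step * lam)
    return A @ Z

# Demo: with A = identity, LRFD reduces to ordinary low-rank matrix completion.
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 40))  # rank-2 target
mask = rng.random(M.shape) < 0.6                                 # 60% observed
M_hat = lrfd_complete(mask * M, mask, np.eye(30))
err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
```

When A is low-rank and well-conditioned rather than the identity, the coefficient matrix Z absorbs the coherent row structure, which is the mechanism the abstract credits for weakening the row-coherence dependence.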
