Matrix Completion Based on Non-Convex Low-Rank Approximation

Without any prior structural information, nuclear norm minimization (NNM), a convex relaxation of rank minimization (RM), is a widely used tool for matrix completion and related low-rank approximation problems. Nevertheless, the result derived by NNM generally deviates from the desired solution, because NNM ignores the differences among singular values and penalizes them all equally. In this paper, we present a non-convex regularizer and use it to construct two matrix completion models. To solve the constructed models efficiently, we develop an optimization method with a convergence guarantee that achieves faster convergence than conventional approaches. In particular, we show that the proposed regularizer and the optimization method are also applicable to other RM problems, such as subspace clustering based on low-rank representation. Extensive experimental results on real images demonstrate that the constructed models offer significant advantages over several state-of-the-art matrix completion algorithms. In addition, we carry out numerous experiments to investigate the convergence speed of the developed optimization method.
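
The abstract does not give the paper's specific regularizer or solver, so the following is only a minimal sketch of the underlying idea, assuming a standard soft-impute style iteration for matrix completion. The function names (`shrink_singular_values`, `complete_matrix`), the `gamma` parameter, and the MCP-style firm-thresholding rule are hypothetical illustrations, not the authors' method; the sketch merely contrasts the uniform shrinkage implied by NNM with a non-convex rule that shrinks large singular values less.

```python
import numpy as np

def shrink_singular_values(M, tau, nonconvex=False, gamma=5.0):
    """Shrink the singular values of M.

    nonconvex=False: soft-thresholding, the proximal operator of the
    nuclear norm, which shrinks every singular value by the same tau.
    nonconvex=True: a hypothetical MCP-style firm-thresholding rule that
    leaves singular values >= gamma * tau untouched, illustrating why
    treating singular values differently can reduce bias on the large ones.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    if nonconvex:
        s_new = np.where(s >= gamma * tau,
                         s,
                         np.maximum(s - tau, 0.0) * gamma / (gamma - 1.0))
    else:
        s_new = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_new) @ Vt

def complete_matrix(X, mask, tau=1.0, n_iter=200, nonconvex=False):
    """Soft-impute style completion: repeatedly fill the missing entries
    with the current estimate, then shrink the singular values."""
    Z = np.where(mask, X, 0.0)
    for _ in range(n_iter):
        Z = shrink_singular_values(np.where(mask, X, Z), tau, nonconvex)
    return Z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 5)) @ rng.standard_normal((5, 50))  # rank-5 ground truth
    mask = rng.random(A.shape) < 0.5                                  # observe ~50% of entries
    for flag in (False, True):
        A_hat = complete_matrix(A, mask, tau=0.5, n_iter=300, nonconvex=flag)
        err = np.linalg.norm(A_hat - A) / np.linalg.norm(A)
        print(f"nonconvex={flag}: relative error = {err:.3f}")
```

The two branches differ only in how the singular values are thresholded, which is exactly the design axis the abstract highlights: the convex (NNM) rule biases all singular values downward by the same amount, while the non-convex rule preserves the dominant ones.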
