Alternating Iteratively Reweighted Minimization Algorithms for Low-Rank Matrix Factorization

The availability of large-scale data across disparate application domains calls for sophisticated tools that can extract valuable knowledge from this wealth of information. In that vein, low-rank representations (LRRs), which seek low-dimensional embeddings of data, have naturally emerged. In an effort to reduce computational complexity and improve estimation performance, LRR has been approached from a matrix factorization (MF) perspective. Recently, low-rank MF (LRMF) approaches have been proposed to tackle an inherent weakness of MF, namely that the dimension of the low-dimensional space in which the data reside is not known in advance. Herein, inspired by the merits of iterative reweighted schemes for rank minimization, we introduce a generic low-rank promoting regularization function. Focusing on a specific instance of it, we then propose a regularizer that imposes column sparsity jointly on the two matrix factors that result from MF, thereby promoting low-rankness of their product. The problems of denoising, matrix completion, and non-negative matrix factorization (NMF) are reformulated according to the new LRMF formulation and solved via efficient Newton-type algorithms with theoretical guarantees on their convergence, and on their rates of convergence, to stationary points. The effectiveness of the proposed algorithms is verified in diverse simulated and real-data experiments.
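To make the formulation concrete, one illustrative instance of a jointly column-sparse regularized LRMF problem of the kind described above can be written as follows (the symbols $\mathbf{Y}$, $\mathbf{U}$, $\mathbf{V}$, $\lambda$, and $d$ are introduced here for exposition and are not quoted from the paper):

$$
\min_{\mathbf{U}\in\mathbb{R}^{m\times d},\;\mathbf{V}\in\mathbb{R}^{n\times d}} \;\; \frac{1}{2}\left\|\mathbf{Y}-\mathbf{U}\mathbf{V}^{\top}\right\|_F^2 \;+\; \lambda\sum_{i=1}^{d}\left(\|\mathbf{u}_i\|_2^2+\|\mathbf{v}_i\|_2^2\right)^{1/2},
$$

where $\mathbf{u}_i$ and $\mathbf{v}_i$ denote the $i$-th columns of $\mathbf{U}$ and $\mathbf{V}$, and $d$ deliberately overestimates the sought rank. The group-sparse penalty drives entire column pairs $(\mathbf{u}_i,\mathbf{v}_i)$ to zero jointly, so the number of surviving pairs reveals the rank of $\mathbf{U}\mathbf{V}^{\top}$.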
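A minimal Python sketch of an alternating iteratively reweighted scheme for this objective is given below. It is an illustration under the smoothed penalty $\sqrt{\|\mathbf{u}_i\|_2^2+\|\mathbf{v}_i\|_2^2+\eta^2}$, with the reweighted least-squares subproblems solved in closed form rather than by the Newton-type iterations analyzed in the paper; the function name and all parameters are ours, not the authors' reference implementation.

```python
import numpy as np

def airls_lrmf(Y, d, lam=0.1, eta=1e-6, n_iter=100, seed=0):
    """Sketch of alternating iteratively reweighted least squares for
    0.5*||Y - U V^T||_F^2 + lam * sum_i sqrt(||u_i||^2 + ||v_i||^2 + eta^2)."""
    m, n = Y.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, d))
    V = rng.standard_normal((n, d))
    for _ in range(n_iter):
        # One weight per column pair (u_i, v_i); eta keeps the weights
        # well defined when a pair shrinks toward zero.
        w = 1.0 / np.sqrt((U**2).sum(axis=0) + (V**2).sum(axis=0) + eta**2)
        D = lam * np.diag(w)
        # With the weights frozen, each factor update is a ridge-like
        # least-squares problem with a closed-form solution.
        U = Y @ V @ np.linalg.inv(V.T @ V + D)
        V = Y.T @ U @ np.linalg.inv(U.T @ U + D)
    return U, V

# Toy usage: factor a rank-3 matrix with d = 10 overestimated columns.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    Y = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))
    U, V = airls_lrmf(Y, d=10, lam=0.5)
    joint_norms = np.sqrt((U**2).sum(axis=0) + (V**2).sum(axis=0))
    print("estimated rank:", int((joint_norms > 1e-3).sum()))
```

After convergence, the effective rank can be read off by counting the column pairs whose joint norm exceeds a small threshold, as in the toy usage above.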
