Guarantees of Riemannian optimization for low rank matrix completion

We study Riemannian optimization methods on the embedded manifold of low rank matrices for the matrix completion problem, which aims to recover a low rank matrix from a subset of its entries. Assume $m$ entries of an $n\times n$ rank $r$ matrix are sampled independently and uniformly with replacement. We first prove that, with high probability, the Riemannian gradient descent and conjugate gradient descent algorithms initialized by one-step hard thresholding converge linearly to the underlying matrix provided \begin{align*} m\geq C_\kappa n^{1.5}r\log^{1.5}(n), \end{align*} where $C_\kappa$ is a numerical constant depending on the condition number of the underlying matrix. The sampling complexity is further improved to \begin{align*} m\geq C_\kappa nr^2\log^{2}(n) \end{align*} via a resampled Riemannian gradient descent initialization. The analysis of the new initialization procedure relies on an asymmetric restricted isometry property of the sampling operator and the curvature of the low rank matrix manifold. Numerical simulations show that the algorithms recover a low rank matrix from nearly the minimum number of measurements.
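For concreteness, the following is a minimal NumPy sketch of the Riemannian gradient descent iteration with one-step hard thresholding initialization described above. It is not the authors' reference implementation: the function names (`hard_threshold`, `tangent_project`, `riemannian_gradient_descent`), the exact line-search step size, and the stopping rule are illustrative assumptions.

```python
import numpy as np

def hard_threshold(X, r):
    """Best rank-r approximation via truncated SVD (used for both
    the initialization and the retraction)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :], U[:, :r], Vt[:r, :].T

def tangent_project(Z, U, V):
    """Project Z onto the tangent space of the rank-r manifold at
    X = U * Sigma * V^T:  P_T(Z) = U U^T Z + Z V V^T - U U^T Z V V^T."""
    UtZ = U.T @ Z
    ZV = Z @ V
    return U @ UtZ + ZV @ V.T - U @ (UtZ @ V) @ V.T

def riemannian_gradient_descent(M_obs, mask, r, n_iters=200, tol=1e-8):
    """Riemannian gradient descent for matrix completion.

    M_obs : observed entries with zeros elsewhere, i.e. P_Omega(M)
    mask  : boolean array marking the sampled entries Omega
    r     : target rank
    """
    n1, n2 = M_obs.shape
    p = mask.sum() / (n1 * n2)            # sampling ratio m / (n1 * n2)
    # One-step hard thresholding initialization: H_r(p^{-1} P_Omega(M)).
    X, U, V = hard_threshold(M_obs / p, r)
    for _ in range(n_iters):
        G = mask * (M_obs - X)            # residual on the observed entries
        if np.linalg.norm(G) <= tol * np.linalg.norm(M_obs):
            break
        PG = tangent_project(G, U, V)     # Riemannian gradient direction
        denom = np.linalg.norm(mask * PG) ** 2
        alpha = np.linalg.norm(PG) ** 2 / denom if denom > 0 else 0.0
        # Retraction back to the rank-r manifold by truncated SVD.
        X, U, V = hard_threshold(X + alpha * PG, r)
    return X

# Example: recover a random rank-3 matrix from roughly 30% of its entries.
rng = np.random.default_rng(0)
n, r = 100, 3
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
mask = rng.random((n, n)) < 0.3
X_hat = riemannian_gradient_descent(mask * M, mask, r)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

The sketch computes a full SVD in every retraction for simplicity; an efficient implementation would exploit the fact that $X_t + \alpha_t P_{T_t}G_t$ has rank at most $2r$ and perform the truncated SVD on a small $2r\times 2r$ factor, so that the per-iteration cost scales with $r$ rather than $n$.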
