Low-Rank Matrix Estimation From Rank-One Projections by Unlifted Convex Optimization

We study an estimator with a convex formulation for the recovery of low-rank matrices from rank-one projections. Using initial estimates of the factors of the rank-$r$ target $d_1\times d_2$ matrix, the estimator operates as a standard quadratic program in a space of dimension $r(d_1+d_2)$. This property makes the estimator significantly more scalable than convex estimators based on lifting and semidefinite programming. Furthermore, we present a streamlined analysis for exact recovery under the real Gaussian measurement model, as well as under a partially derandomized measurement model based on spherical 2-designs. We show that under both models the estimator succeeds, with high probability, if the number of measurements exceeds $r^2 (d_1+d_2)$ up to logarithmic factors. This sample complexity improves on the existing results for nonconvex iterative algorithms.
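
To give a concrete sense of what a convex program over a space of dimension $r(d_1+d_2)$ can look like for rank-one projections $y_i = a_i^\top M b_i$, the sketch below fits factor variables $(U, V)$ to the measurements via a least-squares problem that is linear in $(U, V)$ around hypothetical initial factor estimates $(U_0, V_0)$. This is only an illustrative sketch under those assumptions, not the paper's exact estimator; all variable names, the perturbed-factor initialization, and the use of `numpy.linalg.lstsq` are choices made here for illustration.

```python
# Illustrative sketch (NOT the paper's exact estimator): estimate a rank-r
# matrix M* from rank-one projections y_i = a_i^T M* b_i by solving a convex
# least-squares problem over factor variables (U, V) of total dimension
# r*(d1 + d2), using the parameterization M = U0 V^T + U V0^T, which is
# linear in (U, V) given initial factor estimates (U0, V0).
import numpy as np

rng = np.random.default_rng(0)
d1, d2, r = 30, 25, 2
n = 6 * r * (d1 + d2)                       # number of rank-one measurements

# Ground-truth rank-r matrix and Gaussian measurement vectors a_i, b_i.
M_star = rng.standard_normal((d1, r)) @ rng.standard_normal((r, d2))
A = rng.standard_normal((n, d1))
B = rng.standard_normal((n, d2))
y = np.einsum("ni,ij,nj->n", A, M_star, B)  # y_i = a_i^T M* b_i

# Hypothetical initial factor estimates: a small perturbation of the true
# balanced factors (in practice these could come from a spectral method).
Us, s, Vts = np.linalg.svd(M_star, full_matrices=False)
U0 = Us[:, :r] * np.sqrt(s[:r]) + 0.05 * rng.standard_normal((d1, r))
V0 = Vts[:r].T * np.sqrt(s[:r]) + 0.05 * rng.standard_normal((d2, r))

# With M parameterized as U0 V^T + U V0^T, each measurement is linear in the
# unknowns (U, V), so the residual sum of squares is a convex quadratic
# program; here it is solved as an ordinary least-squares problem.
G_U = np.einsum("ni,nk->nik", A, B @ V0).reshape(n, d1 * r)  # coefficients of vec(U)
G_V = np.einsum("nj,nk->njk", B, A @ U0).reshape(n, d2 * r)  # coefficients of vec(V)
G = np.hstack([G_U, G_V])
z, *_ = np.linalg.lstsq(G, y, rcond=None)

U = z[: d1 * r].reshape(d1, r)
V = z[d1 * r:].reshape(d2, r)
M_hat = U0 @ V.T + U @ V0.T
# When the initial estimates are accurate, the relative error is small.
print("relative error:", np.linalg.norm(M_hat - M_star) / np.linalg.norm(M_star))
```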
