Confidence Region of Singular Subspaces for Low-Rank Matrix Regression

Low-rank matrix regression refers to recovering a low-rank matrix from specially designed measurements and the corresponding noisy outcomes. Numerous statistical methods have been developed over the past decade for efficiently reconstructing such unknown low-rank matrices. In certain applications, it is also of interest to estimate the unknown singular subspaces. In this paper, we revisit the low-rank matrix regression model and introduce a two-step procedure for constructing confidence regions of the singular subspaces. We investigate the distribution of the joint projection distance between the empirical singular subspaces and the true singular subspaces. We prove asymptotic normality of the joint projection distance, with data-dependent centering and normalization, when $r^{3/2}(m_1 + m_2)^{3/2} = o(n/\log n)$, where $m_1$ and $m_2$ denote the matrix row and column sizes, $r$ is the rank, and $n$ is the number of independent random measurements. Consequently, we establish data-dependent confidence regions of the true singular subspaces that attain pre-determined confidence levels asymptotically. Non-asymptotic convergence rates are also established. Numerical results are presented to demonstrate the merits of our methods.
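As a concrete illustration, the following minimal Python sketch shows how the joint projection distance and the resulting confidence-region membership test could be computed. This is a sketch under stated assumptions, not the paper's implementation: `top_r_subspaces` assumes a first-step estimator `M_hat` is already available, and the data-dependent centering `b_hat` and normalization `s_hat` are hypothetical placeholders for the quantities prescribed by the two-step procedure, which are not reproduced here.

```python
import numpy as np
from scipy.stats import norm


def top_r_subspaces(M_hat, r):
    # Rank-r truncated SVD of a first-step estimator M_hat (m1 x m2);
    # returns the empirical left/right singular subspaces U_hat, V_hat.
    U, _, Vt = np.linalg.svd(M_hat, full_matrices=False)
    return U[:, :r], Vt[:r, :].T


def joint_projection_distance_sq(U_hat, V_hat, U, V):
    # Squared joint projection distance between the empirical and true
    # singular subspaces:
    #   ||U_hat U_hat^T - U U^T||_F^2 + ||V_hat V_hat^T - V V^T||_F^2,
    # where U U^T is the orthogonal projector onto the column space of U.
    d_left = np.linalg.norm(U_hat @ U_hat.T - U @ U.T, "fro") ** 2
    d_right = np.linalg.norm(V_hat @ V_hat.T - V @ V.T, "fro") ** 2
    return d_left + d_right


def in_confidence_region(U_hat, V_hat, U, V, b_hat, s_hat, alpha=0.05):
    # Membership test for a level-(1 - alpha) confidence region: the pair
    # (U, V) is accepted when the centered, normalized joint projection
    # distance falls inside the two-sided standard-normal acceptance
    # interval. b_hat and s_hat stand in for the paper's data-dependent
    # centering and normalization and are NOT specified here.
    z = norm.ppf(1.0 - alpha / 2.0)
    stat = (joint_projection_distance_sq(U_hat, V_hat, U, V) - b_hat) / s_hat
    return abs(stat) <= z
```

By the asymptotic normality result, a test of this form attains the nominal coverage level asymptotically; in practice, `b_hat` and `s_hat` must be computed from the data as the two-step procedure prescribes.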
