Inference and uncertainty quantification for noisy matrix completion
Yuling Yan | Cong Ma | Jianqing Fan | Yuxin Chen
[1] Andrea J. Goldsmith,et al. Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming , 2013, IEEE Transactions on Information Theory.
[2] Martin J. Wainwright,et al. Restricted strong convexity and weighted matrix completion: Optimal bounds with noise , 2010, J. Mach. Learn. Res..
[3] National Climatic Data Center , 2011 .
[4] Alexander Shapiro,et al. Matrix Completion With Deterministic Pattern: A Geometric Perspective , 2018, IEEE Transactions on Signal Processing.
[5] Inderjit S. Dhillon,et al. Guaranteed Rank Minimization via Singular Value Projection , 2009, NIPS.
[6] Emmanuel J. Candès,et al. Matrix Completion With Noise , 2009, Proceedings of the IEEE.
[7] J. Robins,et al. Double/Debiased Machine Learning for Treatment and Structural Parameters , 2017 .
[8] Qiang Sun,et al. Principal Component Analysis for Big Data , 2018, Wiley StatsRef: Statistics Reference Online.
[9] V. Koltchinskii,et al. Nuclear norm penalization and optimal rates for noisy low rank matrix completion , 2010, 1011.6256.
[10] Adi Shraibman,et al. Rank, Trace-Norm and Max-Norm , 2005, COLT.
[11] Adel Javanmard,et al. Debiasing the lasso: Optimal sample size for Gaussian designs , 2015, The Annals of Statistics.
[12] Chen Cheng,et al. Asymmetry Helps: Eigenvalue and Eigenvector Analyses of Asymmetrically Perturbed Low-Rank Matrices , 2018, ArXiv.
[13] Jianqing Fan,et al. Robust Covariance Estimation for Approximate Factor Models. , 2016, Journal of econometrics.
[14] G. Stewart. On the Perturbation of Pseudo-Inverses, Projections and Linear Least Squares Problems , 1977 .
[15] Pablo A. Parrilo,et al. Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization , 2007, SIAM Rev..
[16] R. Tibshirani,et al. Sparse Principal Component Analysis , 2006 .
[17] Yuxin Chen,et al. Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview , 2018, IEEE Transactions on Signal Processing.
[18] Xiaodong Li,et al. Model-free Nonconvex Matrix Completion: Local Minima Analysis and Applications in Memory-efficient Kernel PCA , 2019, J. Mach. Learn. Res..
[19] Yuxin Chen,et al. Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems , 2015, NIPS.
[20] Yuling Yan,et al. Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization , 2019, SIAM J. Optim..
[21] Dong Xia,et al. Optimal estimation of low rank density matrices , 2015, J. Mach. Learn. Res..
[22] Dong Xia,et al. Confidence interval of singular vectors for high-dimensional and low-rank matrix regression , 2018, ArXiv.
[23] Cun-Hui Zhang,et al. Calibrated Elastic Regularization in Matrix Completion , 2012, NIPS.
[24] L. Wasserman,et al. HIGH DIMENSIONAL VARIABLE SELECTION. , 2007, Annals of statistics.
[25] Wotao Yin,et al. Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed ℓq Minimization , 2013, SIAM J. Numer. Anal..
[26] Yuxin Chen,et al. Spectral Method and Regularized MLE Are Both Optimal for Top-$K$ Ranking , 2017, Annals of statistics.
[27] Han Liu,et al. A General Theory of Hypothesis Tests and Confidence Regions for Sparse High Dimensional Models , 2014, 1412.8765.
[28] Jianqing Fan,et al. DISTRIBUTED TESTING AND ESTIMATION UNDER SPARSE HIGH DIMENSIONAL MODELS. , 2018, Annals of statistics.
[29] Jianqing Fan,et al. Large covariance estimation by thresholding principal orthogonal complements , 2011, Journal of the Royal Statistical Society. Series B, Statistical methodology.
[30] Sham M. Kakade,et al. Provable Efficient Online Matrix Completion via Non-convex Stochastic Gradient Descent , 2016, NIPS.
[31] P. Bickel,et al. On robust regression with high-dimensional predictors , 2013, Proceedings of the National Academy of Sciences.
[32] Martin J. Wainwright,et al. High-Dimensional Statistics , 2019 .
[33] Pascal Sarda,et al. Factor models and variable selection in high-dimensional regression analysis , 2011 .
[34] Jana Janková,et al. Honest confidence regions and optimality in high-dimensional precision matrix estimation , 2015, TEST.
[35] Yin Zhang,et al. Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm , 2012, Mathematical Programming Computation.
[36] G. Imbens,et al. Approximate residual balancing: debiased inference of average treatment effects in high dimensions , 2016, 1604.07125.
[37] Yuxin Chen,et al. Implicit Regularization in Nonconvex Statistical Estimation: Gradient Descent Converges Linearly for Phase Retrieval, Matrix Completion, and Blind Deconvolution , 2017, Found. Comput. Math..
[38] Zhaoran Wang,et al. A Nonconvex Optimization Framework for Low Rank Matrix Estimation , 2015, NIPS.
[39] Xiaodong Li,et al. Phase Retrieval via Wirtinger Flow: Theory and Algorithms , 2014, IEEE Transactions on Information Theory.
[40] R. Tibshirani,et al. A SIGNIFICANCE TEST FOR THE LASSO. , 2013, Annals of statistics.
[41] Martin J. Wainwright,et al. Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees , 2015, ArXiv.
[42] Xiaodong Li,et al. Nonconvex Rectangular Matrix Completion via Gradient Descent without $\ell_{2,\infty}$ Regularization , 2019 .
[43] Xiao Zhang,et al. A Unified Computational and Statistical Framework for Nonconvex Low-rank Matrix Estimation , 2016, AISTATS.
[44] Tengyu Ma,et al. Matrix Completion has No Spurious Local Minimum , 2016, NIPS.
[45] B. A. Schmitt. Perturbation bounds for matrix square roots and pythagorean sums , 1992 .
[46] Ji Chen,et al. Nonconvex Rectangular Matrix Completion via Gradient Descent Without ℓ₂,∞ Regularization , 2020, IEEE Transactions on Information Theory.
[47] Benjamin Recht,et al. A Simpler Approach to Matrix Completion , 2009, J. Mach. Learn. Res..
[48] Harrison H. Zhou,et al. Asymptotic normality and optimalities in estimation of large Gaussian graphical models , 2013, 1309.6024.
[49] Nathan Srebro,et al. Concentration-Based Guarantees for Low-Rank Matrix Reconstruction , 2011, COLT.
[50] Yuxin Chen,et al. Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval , 2018, Mathematical Programming.
[51] Xiao Zhang,et al. A Primal-Dual Analysis of Global Optimality in Nonconvex Low-Rank Matrix Recovery , 2018, ICML.
[52] Yuxin Chen,et al. The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences , 2016, Communications on Pure and Applied Mathematics.
[53] J. Pauly,et al. Accelerating parameter mapping with a locally low rank constraint , 2015, Magnetic resonance in medicine.
[54] V. Koltchinskii,et al. Oracle inequalities in empirical risk minimization and sparse recovery problems , 2011 .
[55] Sham M. Kakade,et al. A tail inequality for quadratic forms of subgaussian random vectors , 2011, ArXiv.
[56] Javad Lavaei,et al. Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery , 2019, J. Mach. Learn. Res..
[57] P. Wedin. Perturbation bounds in connection with singular value decomposition , 1972 .
[58] Nathan Srebro,et al. Learning with matrix factorizations , 2004 .
[59] A. Singer. Angular Synchronization by Eigenvectors and Semidefinite Programming. , 2009, Applied and computational harmonic analysis.
[60] Felix Krahmer,et al. On the Convex Geometry of Blind Deconvolution and Matrix Completion , 2019, Communications on Pure and Applied Mathematics.
[61] Jianqing Fan,et al. ENTRYWISE EIGENVECTOR ANALYSIS OF RANDOM MATRICES WITH LOW EXPECTED RANK. , 2017, Annals of statistics.
[62] Adel Javanmard,et al. Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory , 2013, IEEE Transactions on Information Theory.
[63] Andrea Montanari,et al. Matrix Completion from Noisy Entries , 2009, J. Mach. Learn. Res..
[64] Massimo Fornasier,et al. Low-rank Matrix Recovery via Iteratively Reweighted Least Squares Minimization , 2010, SIAM J. Optim..
[65] Jianqing Fan,et al. Asymptotic Theory of Eigenvectors for Large Random Matrices , 2019, 1902.06846.
[66] Nathan Srebro,et al. Fast maximum margin matrix factorization for collaborative prediction , 2005, ICML.
[67] Cun-Hui Zhang,et al. Confidence intervals for low dimensional parameters in high dimensional linear models , 2011, 1110.2563.
[68] Adel Javanmard,et al. Confidence intervals and hypothesis testing for high-dimensional regression , 2013, J. Mach. Learn. Res..
[69] Justin K. Romberg,et al. An Overview of Low-Rank Matrix Recovery From Incomplete Observations , 2016, IEEE Journal of Selected Topics in Signal Processing.
[70] Pablo A. Parrilo,et al. Rank-Sparsity Incoherence for Matrix Decomposition , 2009, SIAM J. Optim..
[71] Tengyuan Liang,et al. Geometric Inference for General High-Dimensional Linear Inverse Problems , 2014, 1404.4408.
[72] Yudong Chen,et al. Incoherence-Optimal Matrix Completion , 2013, IEEE Transactions on Information Theory.
[73] Emmanuel J. Candès,et al. Exact Matrix Completion via Convex Optimization , 2008, Found. Comput. Math..
[74] Yuxin Chen,et al. Robust Spectral Compressed Sensing via Structured Matrix Completion , 2013, IEEE Transactions on Information Theory.
[75] Yudong Chen,et al. Leave-One-Out Approach for Matrix Completion: Primal and Dual Analysis , 2018, IEEE Transactions on Information Theory.
[76] Tony F. Chan,et al. Guarantees of Riemannian Optimization for Low Rank Matrix Recovery , 2015, SIAM J. Matrix Anal. Appl..
[77] Sara van de Geer,et al. De-biased sparse PCA: Inference and testing for eigenstructure of large covariance matrices , 2018, 1801.10567.
[78] Yang Cao,et al. Poisson Matrix Recovery and Completion , 2015, IEEE Transactions on Signal Processing.
[79] R. A. Smith. Matrix Equation $XA + BX = C$ , 1968 .
[80] T. Tony Cai,et al. Matrix completion via max-norm constrained optimization , 2013, ArXiv.
[81] Anthony Man-Cho So,et al. Theory of semidefinite programming for Sensor Network Localization , 2005, SODA '05.
[82] Dennis L. Sun,et al. Exact post-selection inference, with application to the lasso , 2013, 1311.6238.
[83] Zhi-Quan Luo,et al. Guaranteed Matrix Completion via Non-Convex Factorization , 2014, IEEE Transactions on Information Theory.
[84] Alexandra Carpentier,et al. On signal detection and confidence sets for low rank inference problems , 2015, 1507.03829.
[85] Peter Bühlmann,et al. p-Values for High-Dimensional Regression , 2008, 0811.2177.
[86] T. Tony Cai,et al. Confidence intervals for high-dimensional linear regression: Minimax rates and adaptivity , 2015, 1506.05539.
[87] Andrea Montanari,et al. Matrix completion from a few entries , 2009, 2009 IEEE International Symposium on Information Theory.
[88] Han Liu,et al. A Unified Theory of Confidence Regions and Testing for High-Dimensional Estimating Equations , 2015, Statistical Science.
[89] Javad Lavaei,et al. No Spurious Solutions in Non-convex Matrix Sensing: Structure Compensates for Isometry , 2021, 2021 American Control Conference (ACC).
[90] Bart Vandereycken,et al. Low-Rank Matrix Completion by Riemannian Optimization , 2013, SIAM J. Optim..
[91] David Gross,et al. Recovering Low-Rank Matrices From Few Coefficients in Any Basis , 2009, IEEE Transactions on Information Theory.
[92] Peter Bühlmann,et al. High-dimensional simultaneous inference with the bootstrap , 2016, 1606.03940.
[93] Robert Tibshirani,et al. Spectral Regularization Algorithms for Learning Large Incomplete Matrices , 2010, J. Mach. Learn. Res..
[94] Yi Zheng,et al. No Spurious Local Minima in Nonconvex Low Rank Problems: A Unified Geometric Analysis , 2017, ICML.
[95] Dmitriy Drusvyatskiy,et al. Composite optimization for robust blind deconvolution , 2019, ArXiv.
[96] Prateek Jain,et al. Low-rank matrix completion using alternating minimization , 2012, STOC '13.
[97] Dong Xia. Data-dependent Confidence Regions of Singular Subspaces , 2019, ArXiv.
[98] S. Geer,et al. Confidence intervals for high-dimensional inverse covariance estimation , 2014, 1403.6752.
[99] Moritz Hardt,et al. Understanding Alternating Minimization for Matrix Completion , 2013, 2014 IEEE 55th Annual Symposium on Foundations of Computer Science.
[100] S. Geer,et al. On asymptotically optimal confidence regions and tests for high-dimensional models , 2013, 1303.0518.
[101] T. Cai,et al. Sparse PCA: Optimal rates and adaptive estimation , 2012, 1211.1309.
[102] Wen-Xin Zhou,et al. A max-norm constrained minimization approach to 1-bit matrix completion , 2013, J. Mach. Learn. Res..
[103] Ewout van den Berg,et al. 1-Bit Matrix Completion , 2012, ArXiv.
[104] A. Tsybakov,et al. Estimation of high-dimensional low-rank matrices , 2009, 0912.5338.
[105] N. Meinshausen,et al. High-Dimensional Inference: Confidence Intervals, $p$-Values and R-Software hdi , 2014, 1408.4026.
[106] A. Carpentier,et al. Constructing confidence sets for the matrix completion problem , 2017, 1704.02760.
[107] Dmitriy Drusvyatskiy,et al. Low-Rank Matrix Recovery with Composite Optimization: Good Conditioning and Rapid Convergence , 2019, Found. Comput. Math..
[108] Yudong Chen,et al. Harnessing Structures in Big Data via Guaranteed Low-Rank Matrix Estimation: Recent Theory and Fast Algorithms via Convex and Nonconvex Optimization , 2018, IEEE Signal Processing Magazine.
[109] R. Nickl,et al. Uncertainty Quantification for Matrix Compressed Sensing and Quantum Tomography Problems , 2015, Progress in Probability.
[110] Javad Lavaei,et al. How Much Restricted Isometry is Needed In Nonconvex Matrix Recovery? , 2018, NeurIPS.
[111] Guang Cheng,et al. Simultaneous Inference for High-Dimensional Linear Models , 2016, 1603.01295.
[112] J. Bai,et al. Confidence Intervals for Diffusion Index Forecasts and Inference for Factor-Augmented Regressions , 2006 .
[113] Max Simchowitz,et al. Low-rank Solutions of Linear Matrix Equations via Procrustes Flow , 2015, ICML.
[114] Noureddine El Karoui,et al. On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators , 2018 .
[115] Yi Ma,et al. Robust principal component analysis? , 2009, JACM.
[116] Dong Xia. Normal approximation and confidence region of singular subspaces , 2021, Electronic Journal of Statistics.
[117] A. Belloni,et al. Inference for High-Dimensional Sparse Econometric Models , 2011, 1201.0220.
[118] John D. Lafferty,et al. Convergence Analysis for Rectangular Matrix Completion Using Burer-Monteiro Factorization and Gradient Descent , 2016, ArXiv.
[119] O. Klopp. Noisy low-rank matrix completion with general sampling distribution , 2012, 1203.0108.
[120] R. Nickl,et al. Adaptive confidence sets for matrix completion , 2016, Bernoulli.
[121] Feng Ruan,et al. Solving (most) of a set of quadratic equalities: Composite optimization for robust phase retrieval , 2017, Information and Inference: A Journal of the IMA.
[122] Junwei Lu,et al. Inter-Subject Analysis: Inferring Sparse Interactions with Dense Intra-Graphs , 2017, 1709.07036.
[123] Simon Mak,et al. Active matrix completion with uncertainty quantification , 2017 .