[1] Gilad Lerman,et al. A Well-Tempered Landscape for Non-convex Robust Subspace Recovery , 2017, J. Mach. Learn. Res..
[2] Jérôme Malick,et al. Projection-like Retractions on Matrix Manifolds , 2012, SIAM J. Optim..
[3] Maryam Fazel,et al. Escaping from saddle points on Riemannian manifolds , 2019, NeurIPS.
[4] John D. Lafferty,et al. A Convergent Gradient Descent Algorithm for Rank Minimization and Semidefinite Programming from Random Linear Measurements , 2015, NIPS.
[5] Moritz Hardt,et al. Understanding Alternating Minimization for Matrix Completion , 2013, 2014 IEEE 55th Annual Symposium on Foundations of Computer Science.
[6] Javad Lavaei,et al. General Low-rank Matrix Optimization: Geometric Analysis and Sharper Bounds , 2021 .
[7] Qiuwei Li,et al. The non-convex geometry of low-rank matrix optimization , 2016, Information and Inference: A Journal of the IMA.
[8] Nicolas Boumal,et al. Efficiently escaping saddle points on manifolds , 2019, NeurIPS.
[9] Bamdev Mishra,et al. Fixed-rank matrix factorizations and Riemannian low-rank optimization , 2012, Comput. Stat..
[10] Zhaoran Wang,et al. A Nonconvex Optimization Framework for Low Rank Matrix Estimation , 2015, NIPS.
[11] Jian-Feng Cai,et al. Toward the Optimal Construction of a Loss Function Without Spurious Local Minima for Solving Quadratic Equations , 2018, IEEE Transactions on Information Theory.
[12] Pierre-Antoine Absil,et al. RTRMC: A Riemannian trust-region method for low-rank matrix completion , 2011, NIPS.
[13] Haoyang Liu,et al. An equivalence between stationary points for rank constraints versus low-rank factorizations , 2018, 1812.00404.
[14] Javad Lavaei,et al. Sharp Restricted Isometry Bounds for the Inexistence of Spurious Local Minima in Nonconvex Matrix Recovery , 2019, J. Mach. Learn. Res..
[15] Stephen J. Wright,et al. A Line-Search Descent Algorithm for Strict Saddle Functions with Complexity Guarantees , 2020, J. Mach. Learn. Res..
[16] Andrea J. Goldsmith,et al. Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming , 2013, IEEE Transactions on Information Theory.
[17] Yuxin Chen,et al. Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems , 2015, NIPS.
[18] Defeng Sun,et al. A multi-stage convex relaxation approach to noisy structured low-rank matrix recovery , 2017, Math. Program. Comput..
[19] Z. Wen,et al. A Brief Introduction to Manifold Optimization , 2019, Journal of the Operations Research Society of China.
[20] Pierre-Antoine Absil,et al. Trust-Region Methods on Riemannian Manifolds , 2007, Found. Comput. Math..
[21] Yonina C. Eldar,et al. Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow , 2016, IEEE Transactions on Information Theory.
[22] Yuejie Chi,et al. Low-Rank Matrix Recovery with Scaled Subgradient Methods: Fast and Robust Convergence Without the Condition Number , 2020, ArXiv.
[23] Wei Hu,et al. Algorithmic Regularization in Learning Deep Homogeneous Models: Layers are Automatically Balanced , 2018, NeurIPS.
[24] Xiaodong Li,et al. Phase Retrieval via Wirtinger Flow: Theory and Algorithms , 2014, IEEE Transactions on Information Theory.
[25] Yuxin Chen,et al. Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview , 2018, IEEE Transactions on Signal Processing.
[26] Junwei Lu,et al. Symmetry, Saddle Points, and Global Optimization Landscape of Nonconvex Matrix Factorization , 2016, 2018 Information Theory and Applications Workshop (ITA).
[27] Bart Vandereycken,et al. Low-Rank Matrix Completion by Riemannian Optimization , 2013, SIAM J. Optim..
[28] Renato D. C. Monteiro,et al. Local Minima and Convergence in Low-Rank Semidefinite Programming , 2004, Math. Program. (DOI 10.1007/s10107-004-0564-1).
[29] Boaz Nadler,et al. Rank 2r Iterative Least Squares: Efficient Recovery of Ill-Conditioned Low Rank Matrices from Few Entries , 2021, SIAM J. Math. Data Sci..
[30] Paul Van Dooren,et al. A Riemannian rank-adaptive method for low-rank optimization , 2016, Neurocomputing.
[31] Shuang Li,et al. The Global Geometry of Centralized and Distributed Low-rank Matrix Recovery Without Regularization , 2020, IEEE Signal Processing Letters.
[32] Wen Huang,et al. Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold , 2017, SIAM J. Imaging Sci..
[33] Furong Huang,et al. Escaping From Saddle Points - Online Stochastic Gradient for Tensor Decomposition , 2015, COLT.
[34] Yuejie Chi,et al. Beyond Procrustes: Balancing-free Gradient Descent for Asymmetric Low-Rank Matrix Sensing , 2019, 2019 53rd Asilomar Conference on Signals, Systems, and Computers.
[35] Prateek Jain,et al. Low-rank matrix completion using alternating minimization , 2012, STOC '13.
[36] U. Helmke,et al. Optimization and Dynamical Systems , 1994, Proceedings of the IEEE.
[37] Anru Zhang,et al. ROP: Matrix Recovery via Rank-One Projections , 2013, ArXiv.
[38] Zhi-Quan Luo,et al. Guaranteed Matrix Completion via Non-Convex Factorization , 2014, IEEE Transactions on Information Theory.
[39] Yin Zhang,et al. Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm , 2012, Mathematical Programming Computation.
[40] Xiaodong Li,et al. Rapid, Robust, and Reliable Blind Deconvolution via Nonconvex Optimization , 2016, Applied and Computational Harmonic Analysis.
[41] Yuejie Chi,et al. Accelerating Ill-Conditioned Low-Rank Matrix Estimation via Scaled Gradient Descent , 2020, J. Mach. Learn. Res..
[42] Defeng Sun,et al. A rank-corrected procedure for matrix completion with fixed basis coefficients , 2012, Math. Program..
[43] Ellen H. Fukuda,et al. An equivalent nonlinear optimization model with triangular low-rank factorization for semidefinite programs , 2021, Optimization Methods and Software.
[44] Jian-Feng Cai,et al. Exploiting the structure effectively and efficiently in low rank matrix recovery , 2018, ArXiv.
[45] Daphna Weinshall,et al. Online Learning in the Embedded Manifold of Low-rank Matrices , 2012, J. Mach. Learn. Res..
[46] Anastasios Kyrillidis,et al. Dropping Convexity for Faster Semi-definite Optimization , 2015, COLT.
[47] Nathan Srebro,et al. Global Optimality of Local Search for Low Rank Matrix Recovery , 2016, NIPS.
[48] Charles R. Johnson,et al. Uniqueness of matrix square roots and an application , 2001 .
[49] Dmitriy Drusvyatskiy,et al. Low-Rank Matrix Recovery with Composite Optimization: Good Conditioning and Rapid Convergence , 2019, Found. Comput. Math..
[50] Emmanuel J. Candès,et al. PhaseLift: Exact and Stable Signal Recovery from Magnitude Measurements via Convex Programming , 2011, ArXiv.
[51] Recursive Importance Sketching for Rank Constrained Least Squares: Algorithms and High-order Convergence , 2020, ArXiv.
[52] Sujay Sanghavi,et al. The Local Convexity of Solving Systems of Quadratic Equations , 2015, 1506.07868.
[53] Xiao Zhang,et al. A Primal-Dual Analysis of Global Optimality in Nonconvex Low-Rank Matrix Recovery , 2018, ICML.
[54] Xiao Zhang,et al. A Unified Computational and Statistical Framework for Nonconvex Low-rank Matrix Estimation , 2016, AISTATS.
[55] Q. Tran-Dinh. Extended Gauss-Newton and ADMM-Gauss-Newton algorithms for low-rank matrix optimization , 2016 .
[56] John Wright,et al. From Symmetry to Geometry: Tractable Nonconvex Problems , 2020, ArXiv.
[57] Michael I. Jordan,et al. How to Escape Saddle Points Efficiently , 2017, ICML.
[58] Yi Zheng,et al. No Spurious Local Minima in Nonconvex Low Rank Problems: A Unified Geometric Analysis , 2017, ICML.
[59] Martin J. Wainwright,et al. Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees , 2015, ArXiv.
[60] Michael I. Jordan,et al. First-order methods almost always avoid strict saddle points , 2019 .
[61] Xiaodong Li,et al. Model-free Nonconvex Matrix Completion: Local Minima Analysis and Applications in Memory-efficient Kernel PCA , 2019, J. Mach. Learn. Res..
[62] Stefan Vandewalle,et al. A Riemannian Optimization Approach for Computing Low-Rank Solutions of Lyapunov Equations , 2010, SIAM J. Matrix Anal. Appl..
[63] A. Bandeira,et al. Deterministic Guarantees for Burer‐Monteiro Factorizations of Smooth Semidefinite Programs , 2018, Communications on Pure and Applied Mathematics.
[64] Andrea Montanari,et al. Matrix completion from a few entries , 2009, 2009 IEEE International Symposium on Information Theory.
[65] Anastasios Kyrillidis,et al. Non-square matrix sensing without spurious local minima via the Burer-Monteiro approach , 2016, AISTATS.
[66] Silvere Bonnabel,et al. Linear Regression under Fixed-Rank Constraints: A Riemannian Approach , 2011, ICML.
[67] André Uschmajew,et al. On critical points of quadratic low-rank matrix optimization problems , 2020, IMA Journal of Numerical Analysis.
[68] P. Absil,et al. Erratum to: "Global rates of convergence for nonconvex optimization on manifolds" , 2016, IMA Journal of Numerical Analysis.
[69] Max Simchowitz,et al. Low-rank Solutions of Linear Matrix Equations via Procrustes Flow , 2015, ICML.
[70] Tony F. Chan,et al. Guarantees of Riemannian Optimization for Low Rank Matrix Recovery , 2015, SIAM J. Matrix Anal. Appl..
[71] P.-A. Absil,et al. A Riemannian rank-adaptive method for low-rank matrix completion , 2021, Computational Optimization and Applications.
[72] Francis R. Bach,et al. Low-Rank Optimization on the Cone of Positive Semidefinite Matrices , 2008, SIAM J. Optim..
[73] Andi Han,et al. Escape saddle points faster on manifolds via perturbed Riemannian stochastic recursive gradient , 2020, ArXiv.
[74] Prateek Jain,et al. Phase Retrieval Using Alternating Minimization , 2013, IEEE Transactions on Signal Processing.
[75] Yuxin Chen,et al. Implicit Regularization in Nonconvex Statistical Estimation: Gradient Descent Converges Linearly for Phase Retrieval, Matrix Completion, and Blind Deconvolution , 2017, Found. Comput. Math..
[76] J R Fienup,et al. Phase retrieval algorithms: a comparison. , 1982, Applied optics.
[77] Alexandre d'Aspremont,et al. Phase recovery, MaxCut and complex semidefinite programming , 2012, Math. Program..
[78] Defeng Sun,et al. A Majorized Penalty Approach for Calibrating Rank Constrained Correlation Matrix Problems , 2010 .
[79] John Wright,et al. A Geometric Analysis of Phase Retrieval , 2016, 2016 IEEE International Symposium on Information Theory (ISIT).
[80] Kwangjun Ahn,et al. Riemannian Perspective on Matrix Factorization , 2021, ArXiv.
[81] Alan Edelman,et al. The Geometry of Algorithms with Orthogonality Constraints , 1998, SIAM J. Matrix Anal. Appl..
[82] Zhihui Zhu,et al. The Global Optimization Geometry of Low-Rank Matrix Optimization , 2017, IEEE Transactions on Information Theory.
[83] Charles R. Johnson,et al. Matrix analysis , 1985, Statistical Inference for Engineers and Data Scientists.
[84] Levent Tunçel,et al. Optimization algorithms on matrix manifolds , 2009, Math. Comput..
[85] Simon S. Du,et al. Global Convergence of Gradient Descent for Asymmetric Low-Rank Matrix Factorization , 2021, NeurIPS.
[86] Jian-Feng Cai,et al. Solving Systems of Phaseless Equations Via Riemannian Optimization with Optimal Sampling Complexity , 2018, Journal of Computational Mathematics.
[87] Teng Zhang,et al. Robust PCA by Manifold Optimization , 2017, J. Mach. Learn. Res..
[88] Aryan Mokhtari,et al. A Newton-Based Method for Nonconvex Optimization with Fast Evasion of Saddle Points , 2017, SIAM J. Optim..
[89] Bamdev Mishra,et al. Riemannian Preconditioning , 2014, SIAM J. Optim..
[90] Michael I. Jordan,et al. First-order methods almost always avoid strict saddle points , 2019, Mathematical Programming.