A Riemannian rank-adaptive method for low-rank matrix completion

The low-rank matrix completion problem can be solved by Riemannian optimization on a fixed-rank manifold. A drawback of the known approaches, however, is that the rank parameter must be fixed a priori. In this paper, we consider the optimization problem on the set of bounded-rank matrices. We propose a Riemannian rank-adaptive method consisting of fixed-rank optimization, a rank increase step, and a rank reduction step, and we explore its performance on the low-rank matrix completion problem. Numerical experiments on synthetic and real-world datasets illustrate that the proposed rank-adaptive method compares favorably with state-of-the-art algorithms. The experiments also show that each aspect of this rank-adaptive framework can be incorporated separately into existing algorithms to improve their performance.
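The three components named above (fixed-rank optimization, rank increase, rank reduction) can be illustrated with a minimal sketch. The code below is not the paper's algorithm: it substitutes a simple truncated-SVD projected-gradient step (hard-impute style) for the Riemannian fixed-rank solver, and uses ad hoc stall-detection and singular-value thresholds; all function names and parameters are hypothetical.

```python
import numpy as np

def complete_matrix_rank_adaptive(M, mask, r0=1, r_max=10, outer_iters=30,
                                  inner_iters=50, stall_tol=1e-8, sigma_tol=1e-3):
    """Hypothetical rank-adaptive completion loop: fixed-rank optimization,
    a rank reduction step (truncate negligible singular values), and a rank
    increase step (grow the rank when progress stalls)."""
    m, n = M.shape
    r = r0
    rng = np.random.default_rng(0)
    X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
    prev_res = np.inf
    for _ in range(outer_iters):
        # Fixed-rank optimization: gradient step on 0.5*||P_Omega(X - M)||^2
        # followed by truncation back to rank r (a crude stand-in for a
        # Riemannian method on the fixed-rank manifold).
        for _ in range(inner_iters):
            X = X - mask * (X - M)                  # unit-step gradient update
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            X = (U[:, :r] * s[:r]) @ Vt[:r]          # project to rank <= r
        res = np.linalg.norm(mask * (X - M))
        # Rank reduction step: drop singular values that are negligible
        # relative to the largest one.
        keep = s[:r] > sigma_tol * s[0]
        r = max(1, int(keep.sum()))
        # Rank increase step: if the residual stalled and the rank budget
        # allows, raise the rank by one.
        if prev_res - res < stall_tol * max(prev_res, 1.0) and r < r_max:
            r += 1
        prev_res = res
    return X, r
```

A typical use would pass the observed matrix `M` (zeros on unobserved entries) and a 0/1 `mask` of the sampling set; the loop starts at rank `r0` and returns the completed matrix together with the final detected rank.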
