Riemannian Conjugate Gradient Descent Method for Third-Order Tensor Completion

The goal of tensor completion is to fill in the missing entries of a partially observed tensor under a low-rank constraint. In this paper, we study low-rank third-order tensor completion problems using Riemannian optimization methods on a smooth manifold. Here the tensor rank is defined as the set of matrix ranks of the frontal slices of the transformed tensor, obtained by applying a Fourier-related transformation along the tubes of the original tensor. We show that, under suitable incoherence conditions on the underlying low-rank tensor, the proposed Riemannian optimization method is guaranteed to converge and to recover the low-rank tensor with high probability. In addition, we derive the number of sampled entries required to solve the low-rank tensor completion problem under different initialization methods. Numerical examples on both synthetic and image data sets demonstrate that the proposed method is able to recover low-rank tensors.
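The rank notion above can be sketched in a few lines of NumPy: transform the tensor along its tubes (here, with the DFT, a standard choice in the transformed t-SVD literature) and collect the rank of each frontal slice of the transformed tensor. The function and variable names below are illustrative, not from the paper, and the random low-rank construction works in the complex domain for simplicity.

```python
import numpy as np

def transformed_multi_rank(T, tol=1e-10):
    """Set of matrix ranks of the frontal slices of the transformed tensor.

    The transform is applied along the tubes (third mode) of T; the DFT is
    used here as one example of a Fourier-related transformation.
    """
    T_hat = np.fft.fft(T, axis=2)
    return tuple(np.linalg.matrix_rank(T_hat[:, :, k], tol=tol)
                 for k in range(T.shape[2]))

def random_low_rank_tensor(n1, n2, n3, r, seed=None):
    """Random n1 x n2 x n3 tensor whose every transformed slice has rank r.

    Each frontal slice of the transformed tensor is built as a product of
    random n1 x r and r x n2 factors; the inverse transform then yields a
    (complex, for simplicity) tensor with transformed multi-rank (r, ..., r).
    """
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n1, r, n3)) + 1j * rng.standard_normal((n1, r, n3))
    B = rng.standard_normal((r, n2, n3)) + 1j * rng.standard_normal((r, n2, n3))
    T_hat = np.stack([A[:, :, k] @ B[:, :, k] for k in range(n3)], axis=2)
    return np.fft.ifft(T_hat, axis=2)
```

A tensor-completion method then seeks a tensor whose transformed multi-rank is small on every slice while matching the observed entries.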
