Tensor completion in hierarchical tensor representations

Compressed sensing extends from the recovery of sparse vectors from undersampled measurements via efficient algorithms to the recovery of low-rank matrices from incomplete information. Here we consider a further extension to the reconstruction of tensors of low multilinear rank, represented in recently introduced hierarchical tensor formats, from a small number of measurements. Hierarchical tensor formats are a flexible generalization of the well-known Tucker representation; they have the advantage that the number of degrees of freedom of a low-rank tensor does not scale exponentially with the order of the tensor. While the corresponding tensor decompositions can be computed efficiently via successive applications of (matrix) singular value decompositions, some important properties of the singular value decomposition do not carry over from the matrix to the tensor case. This results in major computational and theoretical difficulties in designing and analyzing algorithms for low-rank tensor recovery. For instance, the tensor nuclear norm, a canonical analogue of the matrix nuclear norm, is NP-hard to compute in general, in stark contrast to the matrix case. In this book chapter we consider versions of iterative hard thresholding schemes adapted to hierarchical tensor formats. One variant builds on methods from Riemannian optimization and uses a retraction mapping from the tangent space of the manifold of low-rank tensors back to this manifold. We provide first partial convergence results based on a tensor version of the restricted isometry property (TRIP) of the measurement map. Moreover, we give an estimate of the number of measurements that ensures the TRIP at a given tensor rank with high probability for Gaussian measurement maps.
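For orientation, the tensor restricted isometry property referred to above is commonly stated as follows (the rank tuple $\mathbf{r}$ and the constant $\delta_{\mathbf{r}}$ are generic notation, not symbols fixed by the chapter): a linear measurement map $\mathcal{A}$ satisfies the TRIP at multilinear rank $\mathbf{r}$ with constant $\delta_{\mathbf{r}}$ if

\[
(1 - \delta_{\mathbf{r}})\,\|X\|_F^2 \;\le\; \|\mathcal{A}(X)\|_2^2 \;\le\; (1 + \delta_{\mathbf{r}})\,\|X\|_F^2
\]

for all tensors $X$ of multilinear rank at most $\mathbf{r}$.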

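As an illustration of the iterative hard thresholding idea sketched above, the following is a minimal numerical sketch, not the chapter's implementation: it assumes a dense Gaussian measurement matrix, uses the truncated higher-order SVD (Tucker format) as a stand-in for the hierarchical SVD projection, and all function names, step size, and problem sizes are illustrative.

import numpy as np

def unfold(x, mode):
    # Mode-k unfolding of a third-order tensor into a matrix.
    return np.moveaxis(x, mode, 0).reshape(x.shape[mode], -1)

def hosvd_truncate(x, ranks):
    # Quasi-optimal projection onto tensors of multilinear rank <= ranks via the
    # truncated higher-order SVD (stand-in for the hierarchical SVD projection).
    us = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(x, mode), full_matrices=False)
        us.append(u[:, :r])
    core = np.einsum('ijk,ia,jb,kc->abc', x, *us)
    return np.einsum('abc,ia,jb,kc->ijk', core, *us)

def tensor_iht(A, y, shape, ranks, mu=1.0, n_iter=300):
    # Iterative hard thresholding: gradient step on the least-squares data fit,
    # followed by truncation back to the set of low-rank tensors.
    x = np.zeros(shape)
    for _ in range(n_iter):
        residual = y - A @ x.ravel()
        x = hosvd_truncate(x + mu * (A.T @ residual).reshape(shape), ranks)
    return x

# Small synthetic test: recover a random rank-(2,2,2) tensor of size
# 10 x 10 x 10 from m = 300 Gaussian measurements.
rng = np.random.default_rng(0)
shape, ranks = (10, 10, 10), (2, 2, 2)
core = rng.standard_normal(ranks)
us = [np.linalg.qr(rng.standard_normal((n, r)))[0] for n, r in zip(shape, ranks)]
x_true = np.einsum('abc,ia,jb,kc->ijk', core, *us)
A = rng.standard_normal((300, x_true.size)) / np.sqrt(300)
y = A @ x_true.ravel()
x_hat = tensor_iht(A, y, shape, ranks)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

In this strongly overdetermined toy regime the fixed step size mu = 1.0 typically suffices; with fewer measurements the step size would have to be chosen more carefully, for example via a normalized IHT rule.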