A dual framework for trace norm regularized low-rank tensor completion

One popular approach to low-rank tensor completion is to use the latent trace norm as a low-rank regularizer. However, most existing works learn a sparse combination of tensors. In this work, we address this limitation by proposing a variant of the latent trace norm that helps to learn a non-sparse combination of tensors. We develop a dual framework for solving the resulting latent trace norm regularized low-rank tensor completion problem. Within this framework, we first give a novel characterization of the solution space via a factorization, and then propose two scalable optimization formulations. The problems are shown to lie on a Cartesian product of Riemannian spectrahedron manifolds. We exploit the versatile Riemannian optimization framework to propose computationally efficient trust-region algorithms. Experiments show the good performance of the proposed algorithms on several real-world data sets across different applications.
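The spectrahedron constraint mentioned above can be made concrete with a small illustration. A rank-`r` factorization `X = Y Yᵀ` with `trace(X) = 1` corresponds to a factor `Y` with unit Frobenius norm, so first-order Riemannian optimization on this set reduces to gradient steps projected onto (and retracted back to) the unit sphere of matrices. The sketch below is a minimal, hypothetical illustration in NumPy, not the paper's trust-region algorithm (which uses second-order Riemannian methods, e.g. via Manopt); the objective `trace(C Y Yᵀ)` and all names are assumptions chosen for clarity.

```python
import numpy as np

def min_trace_on_spectrahedron(C, r=1, steps=500, lr=0.1, seed=0):
    """Riemannian gradient descent on the factored spectrahedron
    {Y in R^{n x r} : ||Y||_F = 1}, i.e. X = Y Y^T is PSD with trace(X) = 1.

    Illustrative toy objective (not from the paper): minimize
    f(Y) = trace(C Y Y^T); for symmetric C its minimum is lambda_min(C).
    """
    rng = np.random.default_rng(seed)
    n = C.shape[0]
    Y = rng.standard_normal((n, r))
    Y /= np.linalg.norm(Y)                      # start on the unit Frobenius sphere
    for _ in range(steps):
        egrad = 2 * C @ Y                       # Euclidean gradient of f
        rgrad = egrad - np.sum(egrad * Y) * Y   # project onto the tangent space
        Y = Y - lr * rgrad                      # gradient step in the tangent direction
        Y /= np.linalg.norm(Y)                  # retraction: renormalize to the sphere
    return np.trace(C @ Y @ Y.T), Y
```

The same projection/retraction pattern underlies more sophisticated solvers; a trust-region method would additionally use the Riemannian Hessian to choose each step, which is what toolboxes such as Manopt automate.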
