Efficient Low Rank Tensor Ring Completion

Using the matrix product state (MPS) representation of the recently proposed tensor ring (TR) decomposition, we propose a TR completion algorithm based on alternating minimization over the factors of the MPS representation. This development is motivated in part by the success of matrix completion algorithms that alternate over the (low-rank) factors. We propose a novel initialization method and analyze the computational complexity of the TR completion algorithm. Numerical comparisons with existing algorithms that employ a low-rank tensor train (TT) approximation for data completion show that our method outperforms them on a variety of real computer vision tasks, demonstrating the improved expressive power of the tensor ring relative to the tensor train.
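The key structural fact behind the alternating scheme is that, with all cores but the k-th held fixed, every observed tensor entry is linear in the slices of core k, so each alternating step reduces to linear least squares. Below is a minimal sketch of one such sweep, assuming numpy; the helper names (`tr_entry`, `als_sweep`) are illustrative, and this does not reproduce the paper's novel initialization or its complexity-optimized implementation.

```python
import numpy as np

def tr_entry(cores, idx):
    """Evaluate one TR entry: trace(G1[:, i1, :] @ ... @ Gd[:, id, :])."""
    M = cores[0][:, idx[0], :]
    for k in range(1, len(cores)):
        M = M @ cores[k][:, idx[k], :]
    return np.trace(M)

def als_sweep(cores, idx_list, vals):
    """One alternating-minimization sweep over the TR cores.

    cores[k] has shape (r_k, n_k, r_{k+1}) with r_{d+1} = r_1 (the ring
    closure); idx_list holds the multi-indices of the observed entries
    and vals their values.
    """
    d = len(cores)
    for k in range(d):
        r_left, n_k, r_right = cores[k].shape
        for j in range(n_k):  # update one lateral slice of core k at a time
            rows, rhs = [], []
            for idx, v in zip(idx_list, vals):
                if idx[k] != j:
                    continue
                # Product of the remaining core slices in cyclic order:
                # G_{k+1}[i_{k+1}] ... G_d[i_d] G_1[i_1] ... G_{k-1}[i_{k-1}]
                M = np.eye(r_right)
                for m in list(range(k + 1, d)) + list(range(k)):
                    M = M @ cores[m][:, idx[m], :]
                # trace(G_k[:, j, :] @ M) = <vec(G_k[:, j, :]), vec(M.T)>,
                # so each observed entry contributes one least-squares row.
                rows.append(M.T.ravel())
                rhs.append(v)
            if rows:
                sol, *_ = np.linalg.lstsq(np.asarray(rows),
                                          np.asarray(rhs), rcond=None)
                cores[k][:, j, :] = sol.reshape(r_left, r_right)
    return cores
```

A typical usage would draw random cores of TR rank (2, 2, 2) for a small (say 8x8x8) tensor, reveal a fraction of its entries, and call `als_sweep` repeatedly until the fit on the observed entries (measured via `tr_entry`) stops improving. Note the naive inner loop above recomputes the subchain product for every entry; an efficient implementation would cache these partial products across slices rather than recomputing them.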
