Tensor Ring Decomposition with Rank Minimization on Latent Space: An Efficient Approach for Tensor Completion

In tensor completion tasks, traditional low-rank tensor decomposition models suffer from a laborious model-selection problem because of their high sensitivity to the chosen ranks. For tensor ring (TR) decomposition in particular, the number of candidate models grows exponentially with the tensor order, which makes finding the optimal TR decomposition rather challenging. In this paper, by exploiting the low-rank structure of the TR latent space, we propose a novel tensor completion method that is robust to model selection. Instead of imposing a low-rank constraint on the data space, we introduce nuclear norm regularization on the latent TR factors, so that the singular value decomposition (SVD) step of the optimization is performed at a much smaller scale. Using an alternating direction method of multipliers (ADMM) scheme, the latent TR factors with optimal rank and the recovered tensor are obtained simultaneously. The proposed algorithm is shown to effectively alleviate the burden of TR-rank selection, thereby greatly reducing the computational cost. Extensive experiments on both synthetic and real-world data demonstrate the superior performance and efficiency of the proposed approach over state-of-the-art algorithms.
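To make the scale argument concrete, nuclear norm regularization induces a proximal step that amounts to singular value thresholding (SVT) of each latent factor's unfolding. The Python sketch below is illustrative only, not the authors' implementation; the function name svt, the core G, and the threshold tau are assumptions introduced for this example. It shows why the SVD is cheap in the latent space: a TR core of shape (r, n_k, r) unfolds to an n_k-by-r^2 matrix, which is far smaller than any unfolding of the full data tensor.

```python
import numpy as np

def svt(mat, tau):
    """Singular value thresholding: the proximal operator of
    tau * (nuclear norm), i.e. shrink each singular value by tau."""
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    keep = s_shrunk > 0
    return (u[:, keep] * s_shrunk[keep]) @ vt[keep, :]

# Illustrative third-order TR core G of shape (r_prev, n_k, r_next);
# in a TR model these ranks are small, so the SVD below is cheap.
rng = np.random.default_rng(0)
r_prev, n_k, r_next = 4, 10, 4
G = rng.standard_normal((r_prev, n_k, r_next))

# Unfold along the middle (mode-2) dimension to an n_k x (r_prev*r_next)
# matrix, apply SVT, then fold back into a third-order core.
G2 = np.transpose(G, (1, 0, 2)).reshape(n_k, r_prev * r_next)
G2_low = svt(G2, tau=0.5)
G_new = np.transpose(G2_low.reshape(n_k, r_prev, r_next), (1, 0, 2))
```

In an ADMM scheme of the kind described above, a shrinkage step of this form would alternate with least-squares updates of the cores and an update of the tensor on the observed entries; since the SVT here acts on n_k x r^2 matrices rather than on unfoldings of the full tensor, overestimated TR ranks can be shrunk away at little extra cost.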
