Higher-dimension Tensor Completion via Low-rank Tensor Ring Decomposition

The problem of incomplete data is common in signal processing and machine learning. Tensor completion algorithms aim to recover incomplete data from its partially observed entries. In this paper, taking advantage of the high compressibility and flexibility of the recently proposed tensor ring (TR) decomposition, we propose a new tensor completion approach named tensor ring weighted optimization (TR-WOPT). It finds the latent factors of the incomplete tensor by a gradient descent algorithm; the latent factors are then employed to predict the missing entries of the tensor. We conduct various tensor completion experiments on synthetic data and real-world data. The simulation results show that TR-WOPT performs well on various high-dimensional tensors. Furthermore, the image completion results show that our proposed algorithm outperforms the state-of-the-art algorithms in many situations. In particular, when the missing rate of the test images is high (e.g., over 0.9), TR-WOPT performs significantly better than the compared algorithms.
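
As a rough illustration of the idea described above (not the authors' released implementation), the sketch below optimizes the TR cores directly against the observed entries with plain gradient descent and then reads the missing values off the reconstruction. The function names (tr_to_full, tr_wopt), the initialization scale, the learning rate, and the step count are illustrative assumptions, and JAX automatic differentiation stands in for the analytic gradients derived in the paper.

    import jax
    import jax.numpy as jnp

    def tr_to_full(cores):
        # Contract TR cores G_k of shape (r_k, I_k, r_{k+1}) into a flattened full tensor.
        out = cores[0]                                           # (r_1, I_1, r_2)
        for core in cores[1:]:
            out = out.reshape(out.shape[0], -1, out.shape[-1])   # (r_1, prod(I), r_k)
            out = jnp.einsum('aib,bjc->aijc', out, core)         # absorb the next core
        out = out.reshape(out.shape[0], -1, out.shape[-1])
        return jnp.einsum('aia->i', out)                         # close the ring: trace over r_1

    def loss(cores, data, mask):
        # Squared error measured on the observed entries only (mask is 1 where observed).
        recon = tr_to_full(cores).reshape(data.shape)
        return 0.5 * jnp.sum(mask * (recon - data) ** 2)

    def tr_wopt(data, mask, ranks, steps=2000, lr=1e-2, seed=0):
        # ranks = [r_1, ..., r_N]; the ring closes with r_{N+1} = r_1.
        shape = data.shape
        keys = jax.random.split(jax.random.PRNGKey(seed), len(shape))
        cores = [0.1 * jax.random.normal(keys[k], (ranks[k], shape[k], ranks[(k + 1) % len(shape)]))
                 for k in range(len(shape))]
        grad_fn = jax.jit(jax.grad(loss))
        for _ in range(steps):
            grads = grad_fn(cores, data, mask)
            cores = [c - lr * g for c, g in zip(cores, grads)]   # plain gradient step
        return tr_to_full(cores).reshape(shape)                  # completed tensor estimate

For a synthetic test matching the experiments described above, one would generate a low-TR-rank tensor, zero out a large fraction (e.g., 90%) of its entries in mask, run tr_wopt, and compare the output against the ground truth on the held-out entries.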
