Tensor Factorization for Low-Rank Tensor Completion

Recently, a tensor nuclear norm (TNN) based method was proposed for the tensor completion problem and achieved state-of-the-art performance on image and video inpainting tasks. However, it requires computing the tensor singular value decomposition (t-SVD), which is computationally expensive and therefore scales poorly to the large tensors that arise in practice. Motivated by TNN, we propose a novel low-rank tensor factorization method for efficiently solving the 3-way tensor completion problem. Our method preserves the low-rank structure of a tensor by factorizing it into the product of two tensors of smaller sizes. During optimization, only these two smaller tensors need to be updated, which is much cheaper than computing the t-SVD. Furthermore, we prove that the proposed alternating minimization algorithm converges to a Karush–Kuhn–Tucker point. Experimental results on synthetic data recovery and on image and video inpainting tasks clearly demonstrate the superior performance and efficiency of our method over state-of-the-art approaches, including TNN and matricization-based methods.
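To make the factorization idea concrete, the following is a minimal NumPy sketch, not the authors' released code, of 3-way tensor completion via a low-tubal-rank factorization X ≈ A * B, where * denotes the t-product (FFT along the third mode, frontal-slice matrix products, inverse FFT). The alternating updates shown here are generic per-slice least-squares steps in the spirit of the abstract; the function names (tprod, tctf_sketch), the chosen tubal rank r, and the iteration count are illustrative assumptions.

import numpy as np

def tprod(A, B):
    # t-product of A (n1 x r x n3) and B (r x n2 x n3): slice-wise products in the Fourier domain.
    Af = np.fft.fft(A, axis=2)
    Bf = np.fft.fft(B, axis=2)
    n3 = A.shape[2]
    Cf = np.empty((A.shape[0], B.shape[1], n3), dtype=complex)
    for k in range(n3):
        Cf[:, :, k] = Af[:, :, k] @ Bf[:, :, k]
    return np.real(np.fft.ifft(Cf, axis=2))

def tctf_sketch(M, mask, r, iters=100):
    # Hypothetical alternating minimization for min ||P_Omega(A*B - M)||_F^2.
    n1, n2, n3 = M.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n1, r, n3))
    B = rng.standard_normal((r, n2, n3))
    X = np.where(mask, M, 0.0)               # initialize with observed entries
    for _ in range(iters):
        Xf = np.fft.fft(X, axis=2)
        Af = np.fft.fft(A, axis=2)
        Bf = np.fft.fft(B, axis=2)
        for k in range(n3):                  # closed-form least-squares updates per frontal slice
            Af[:, :, k] = Xf[:, :, k] @ np.linalg.pinv(Bf[:, :, k])
            Bf[:, :, k] = np.linalg.pinv(Af[:, :, k]) @ Xf[:, :, k]
        A = np.real(np.fft.ifft(Af, axis=2))
        B = np.real(np.fft.ifft(Bf, axis=2))
        X = tprod(A, B)
        X[mask] = M[mask]                    # re-impose the observed entries each iteration
    return X, A, B

The point of the sketch is the cost profile described above: each iteration touches only the two small factors A (n1 x r x n3) and B (r x n2 x n3) through per-slice least-squares solves, rather than computing a full t-SVD of the n1 x n2 x n3 tensor.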
