Image Inpainting Exploiting Tensor Train and Total Variation

In this paper, we propose a novel approach to RGB image inpainting that recovers the missing entries of an image via low-rank tensor completion. The approach builds on the recently proposed tensor train (TT) decomposition, which is used to effectively enforce the low-rankness of the image. In addition, it exploits the local smoothness prior of visual data by incorporating a 2D total variation (TV) term. A ket augmentation (KA) scheme first casts the image into a high-order tensor; low-rankness constraints on the balanced KA-TT matrices and a TV-norm constraint are then applied jointly to recover the missing entries. To reduce the computational complexity, the nuclear norm is replaced by the minimum of the Frobenius norms of two factorization matrices, which reduces the time spent on singular value decomposition (SVD). Finally, an efficient alternating direction method of multipliers (ADMM) algorithm is developed to solve the proposed model. Image inpainting experiments demonstrate that the proposed approach performs significantly better than competing methods.
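To make two of the ingredients above concrete, the following Python/NumPy sketch (not part of the paper; the function name ket_augmentation and the 256x256 example size are illustrative assumptions) shows how a KA-style reshaping casts a dyadic RGB image into a high-order tensor with modes of size 4, and how a balanced factorization X = UV lets the nuclear norm be expressed through Frobenius norms of the factors.

```python
import numpy as np

def ket_augmentation(img):
    """Sketch of a KA-style reshaping: cast a (2^n x 2^n x 3) RGB image into an
    (n+1)-order tensor with n spatial modes of size 4 (one per dyadic scale)
    and a trailing color mode. Assumes a square image with dyadic side length."""
    h, w, c = img.shape
    n = int(np.log2(h))
    assert h == w == 2 ** n, "this sketch assumes a 2^n x 2^n image"
    # split each spatial axis into n binary digits: (i_1,...,i_n, j_1,...,j_n, c)
    t = img.reshape((2,) * n + (2,) * n + (c,))
    # interleave row/column digits so each scale contributes a pair (i_k, j_k)
    order = [d for k in range(n) for d in (k, n + k)] + [2 * n]
    t = t.transpose(order)
    # merge each (i_k, j_k) pair into a single mode of size 4
    return t.reshape((4,) * n + (c,))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((256, 256, 3))          # stand-in for an RGB image
    print(ket_augmentation(img).shape)       # (4, 4, 4, 4, 4, 4, 4, 4, 3)

    # Frobenius-norm surrogate for the nuclear norm: for any X = U @ V,
    # ||X||_* <= 0.5 * (||U||_F^2 + ||V||_F^2), with equality at the
    # balanced factorization U = U_s sqrt(S), V = sqrt(S) V_t.
    X = rng.random((64, 64))
    Us, S, Vt = np.linalg.svd(X, full_matrices=False)
    U, V = Us * np.sqrt(S), np.sqrt(S)[:, None] * Vt
    print(np.isclose(S.sum(),
                     0.5 * (np.linalg.norm(U) ** 2 + np.linalg.norm(V) ** 2)))
```

Minimizing the Frobenius-norm bound over the factors U and V, rather than the nuclear norm of the full unfolding, is what lets the method sidestep repeated SVDs of large KA-TT matrices.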
