Bayesian Low Rank Tensor Ring Model for Image Completion

The low-rank tensor ring model is powerful for image completion, which recovers entries lost during data acquisition and transformation. Recently proposed tensor ring (TR) based completion algorithms generally solve the low-rank optimization problem by alternating least squares with predefined ranks, which easily leads to overfitting when the unknown ranks are set too large and only a few measurements are available. In this paper, we present a Bayesian low-rank tensor ring model for image completion that automatically learns the low-rank structure of the data. A multiplicative interaction model is developed for the low-rank tensor ring decomposition, where the core factors are enforced to be sparse by assuming their entries follow a Student-t distribution. Compared with most existing methods, the proposed one is free of parameter tuning, and the TR ranks can be obtained by Bayesian inference. Numerical experiments on synthetic data, color images of different sizes, and the YaleFace dataset B for a single pose show that the proposed approach outperforms state-of-the-art methods, especially in terms of recovery accuracy.
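To make the decomposition concrete, the sketch below illustrates how a tensor ring factorization reconstructs a full tensor from its cores, together with the Gaussian-Gamma hierarchy that yields a Student-t prior on core entries. This is a minimal illustrative example under our own assumptions: the names (tr_reconstruct, ranks, precision) are hypothetical and not from the paper, and the prior sketch only mirrors the general Student-t construction, not the paper's inference updates.

```python
import numpy as np

def tr_reconstruct(cores):
    """Reconstruct the full tensor from a list of TR cores.

    Core k has shape (R_k, I_k, R_{k+1}), with R_{N+1} = R_1 so that the
    chain of ranks closes into a ring; each tensor entry is the trace of
    the product of the corresponding core slices.
    """
    full = cores[0]                              # shape (R_1, I_1, R_2)
    for core in cores[1:]:
        r1, i_prod, _ = full.shape
        _, i_k, r_next = core.shape
        # Contract the trailing rank with the next core's leading rank.
        full = np.einsum('aib,bjc->aijc', full, core)
        full = full.reshape(r1, i_prod * i_k, r_next)
    # Close the ring: trace over the matching boundary ranks R_1.
    vec = np.einsum('aia->i', full)
    return vec.reshape([c.shape[1] for c in cores])

rng = np.random.default_rng(0)
dims = [5, 6, 4, 7]                              # tensor sizes I_1..I_4
ranks = [3, 4, 2, 3]                             # TR ranks R_1..R_4 (R_5 = R_1)
cores = [rng.standard_normal((ranks[k], dims[k], ranks[(k + 1) % len(dims)]))
         for k in range(len(dims))]
X = tr_reconstruct(cores)
print(X.shape)                                   # (5, 6, 4, 7)

# Student-t sparsity sketch (assumed Gaussian-Gamma hierarchy, not the
# paper's exact updates): an entry drawn from a zero-mean Gaussian whose
# precision is Gamma-distributed is marginally Student-t; entries whose
# inferred precisions grow large are effectively pruned, which is what
# drives automatic TR-rank determination.
precision = rng.gamma(shape=1e-6, scale=1e6)
core_entry = rng.normal(loc=0.0, scale=precision ** -0.5)
```

With overestimated initial ranks, such a sparsity-inducing prior would shrink redundant slices of the cores toward zero, so the effective TR ranks emerge from inference rather than manual tuning.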
