A Closed Form Solution to Best Rank-1 Tensor Approximation via KL divergence Minimization

Tensor decomposition is a fundamentally challenging problem. Even the simplest case, rank-1 approximation in terms of the Least Squares (LS) error, is known to be NP-hard. Here, we show that, if we consider the KL divergence instead of the LS error, we can analytically derive a closed-form solution for the rank-1 tensor that minimizes the KL divergence from a given positive tensor. Our key insight is to treat a positive tensor as a probability distribution and formulate rank-1 approximation as a projection onto the set of rank-1 tensors. This enables us to solve rank-1 approximation by convex optimization. We empirically demonstrate that our algorithm is an order of magnitude faster than existing rank-1 approximation methods and gives better approximations of the given tensors, which supports our theoretical finding.
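The abstract describes the approach (treat the positive tensor as a probability distribution and project it onto the set of rank-1 tensors) but does not reproduce the formula. The following minimal sketch illustrates one way such a KL projection can be realized: the outer product of the tensor's normalized mode-wise marginals, rescaled to the original total mass, which is the standard information-geometric projection onto the independence model. The function name `rank1_kl_approximation` and the exact normalization are illustrative assumptions, not taken verbatim from the paper.

```python
from functools import reduce

import numpy as np


def rank1_kl_approximation(P):
    """Rank-1 approximation of a strictly positive tensor via KL projection.

    Sketch under the assumption that the KL-minimizing rank-1 tensor is the
    outer product of the normalized mode-wise marginals of P, rescaled so
    that the total mass of P is preserved.
    """
    P = np.asarray(P, dtype=float)
    if np.any(P <= 0):
        raise ValueError("P must be a strictly positive tensor")
    total = P.sum()
    # Mode-k marginal: sum over every axis except k, then normalize.
    marginals = [
        P.sum(axis=tuple(j for j in range(P.ndim) if j != k)) / total
        for k in range(P.ndim)
    ]
    # Outer product of the marginals, rescaled to the original total mass.
    return total * reduce(np.multiply.outer, marginals)


# Usage: approximate a random positive 3-way tensor.
rng = np.random.default_rng(0)
P = rng.random((4, 5, 6)) + 0.1
Q = rank1_kl_approximation(P)
print(Q.shape)                                   # (4, 5, 6)
# The mode-1 unfolding of a rank-1 tensor has matrix rank 1.
print(np.linalg.matrix_rank(Q.reshape(4, -1)))   # 1
```

Because the result is a single outer product of one-pass marginal sums, this sketch runs in time linear in the number of tensor entries, which is consistent with the claimed speedup over iterative rank-1 approximation methods.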
