Variable Projection Applied to Block Term Decomposition of Higher-Order Tensors

Higher-order tensors have become popular in many areas of applied mathematics, such as statistics, scientific computing, signal processing, and machine learning, in large part because of the many ways in which a tensor can be decomposed. In this paper, we focus on the best least-squares approximation of a higher-order tensor by a block term decomposition. Using variable projection, we express the tensor approximation problem as the minimization of a cost function on a Cartesian product of Stiefel manifolds. The effect of variable projection on the Riemannian gradient algorithm is studied through numerical experiments.

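To make the role of variable projection concrete, the following equations give a minimal sketch for a third-order tensor; the notation (a tensor T, factor matrices A_r, B_r, C_r with orthonormal columns, core tensors D_r, and Stiefel manifolds St(·,·)) follows common block term decomposition conventions and is illustrative rather than taken verbatim from the paper. The least-squares approximation problem reads

\[
\min_{\{\mathbf{A}_r,\,\mathbf{B}_r,\,\mathbf{C}_r,\,\mathcal{D}_r\}}
\ \tfrac{1}{2}\Big\|\,\mathcal{T}-\sum_{r=1}^{R}\mathcal{D}_r\times_1\mathbf{A}_r\times_2\mathbf{B}_r\times_3\mathbf{C}_r\Big\|_F^2,
\qquad
\mathbf{A}_r\in\mathrm{St}(I,L_r),\ \mathbf{B}_r\in\mathrm{St}(J,M_r),\ \mathbf{C}_r\in\mathrm{St}(K,N_r).
\]

The core tensors enter the residual linearly, so for fixed factor matrices they solve an ordinary linear least-squares problem in closed form. Eliminating them (the variable projection step) leaves a cost function in the factor matrices alone,

\[
f(\mathbf{A},\mathbf{B},\mathbf{C})=\tfrac{1}{2}\big\|\big(\mathbf{I}-\mathbf{P}_{\mathbf{M}}\big)\,\mathrm{vec}(\mathcal{T})\big\|_2^2,
\qquad
\mathbf{M}=\big[\,\mathbf{C}_1\otimes\mathbf{B}_1\otimes\mathbf{A}_1\ \ \cdots\ \ \mathbf{C}_R\otimes\mathbf{B}_R\otimes\mathbf{A}_R\,\big],
\]

where P_M denotes the orthogonal projector onto the column space of M. The projected cost f depends only on the factor matrices, i.e., it is a function on a Cartesian product of Stiefel manifolds, to which a Riemannian gradient method can then be applied.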