Multilinear tensor rank estimation via Sparse Tucker Decomposition

When Tucker-based tensor decomposition is applied to approximate given tensor data with a low-rank model, the appropriate multilinear tensor rank is often unknown, and it must be tuned over a large number of candidate combinations. In this paper, we propose a new algorithm for sparse Tucker decomposition that estimates an appropriate multilinear tensor rank of the given data. The method imposes an orthogonality constraint on the basis matrices and a sparsity constraint on the core tensor, and prunes redundant components by maximizing the sparsity of the core tensor subject to an error bound. We therefore call this method the "Pruning Sparse Tucker Decomposition" (PSTD). PSTD is useful for estimating the appropriate multilinear tensor rank in Tucker-based sparse representations such as compression. We present several experiments that demonstrate the advantages of the proposed method.
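As a rough illustration of the idea described above (not the authors' implementation), the following sketch computes orthogonal basis matrices via the higher-order SVD, sparsifies the core tensor by hard-thresholding its smallest entries while the discarded energy stays within a relative error bound, and then reads off an estimated multilinear rank by counting the remaining non-zero core slices per mode. The helper names (`unfold`, `hosvd`, `estimate_rank`) and the specific thresholding rule are assumptions for illustration only.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move `mode` to the front and flatten the remaining modes.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    # Orthogonal basis matrices: left singular vectors of each mode-n unfolding.
    U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0] for n in range(T.ndim)]
    # Core tensor: project T onto the bases, G = T x_1 U1^T x_2 U2^T ...
    G = T
    for n, Un in enumerate(U):
        G = np.moveaxis(np.tensordot(Un.T, np.moveaxis(G, n, 0), axes=1), 0, n)
    return G, U

def estimate_rank(T, rel_err=0.05):
    # Sparsify the core under a relative reconstruction-error bound, then
    # count non-zero slices per mode as the estimated multilinear rank.
    G, _ = hosvd(T)
    mags = np.sort(np.abs(G).ravel())
    # Largest threshold such that the zeroed entries carry less than
    # rel_err * ||T||_F of energy (orthogonality makes this bound exact).
    cum = np.sqrt(np.cumsum(mags ** 2))
    k = np.searchsorted(cum, rel_err * np.linalg.norm(T))
    thresh = mags[k - 1] if k > 0 else 0.0
    G_sparse = np.where(np.abs(G) > thresh, G, 0.0)
    return tuple(
        int(np.count_nonzero(np.abs(unfold(G_sparse, n)).sum(axis=1)))
        for n in range(G_sparse.ndim)
    )
```

For a tensor that exactly has multilinear rank (2, 3, 2), the HOSVD core is (numerically) zero outside a 2x3x2 block, so pruning the near-zero slices recovers that rank; for noisy data the `rel_err` bound controls how aggressively small components are pruned.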
