Tensor Decomposition with Smoothness

Real data tensors are typically high dimensional, yet their intrinsic information is preserved in a low-dimensional space, which motivates tensor decompositions such as the Tucker decomposition. Frequently, real data tensors are also smooth in addition to being low dimensional, meaning that adjacent elements are similar or change continuously; this property is common in spatial and temporal data. We propose smoothed Tucker decomposition (STD) to incorporate the smoothness property. STD leverages smoothness by expressing the decomposition through a sum of a few basis functions, which reduces the number of parameters. The objective function is formulated as a convex problem, and an algorithm based on the alternating direction method of multipliers is derived to solve it. We theoretically show that, under the smoothness assumption, STD achieves a better error bound. The theoretical result and the performance of STD are verified numerically.
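To make the parameter-reduction idea concrete, the sketch below shows how a Tucker factor can be forced to be smooth by writing it as a fixed smooth basis (here a cosine basis, chosen only for illustration) times a small coefficient matrix. This is a minimal illustration of the "sum of a few basis functions" idea, not the paper's convex formulation or its ADMM solver; all function names, sizes, and the choice of basis are assumptions.

```python
# Minimal sketch (assumption: cosine basis, toy sizes), not the authors' ADMM algorithm.
import numpy as np

def smooth_basis(n, k):
    """Fixed cosine basis: n sample points along a mode, k smooth basis functions."""
    t = np.linspace(0.0, 1.0, n)
    return np.stack([np.cos(np.pi * j * t) for j in range(k)], axis=1)  # shape (n, k)

def smoothed_tucker_reconstruct(core, coeffs, bases):
    """Reconstruct a 3-way tensor from a core and smooth factors U_m = B_m @ C_m."""
    factors = [B @ C for B, C in zip(bases, coeffs)]          # each factor is (n_m, r_m)
    return np.einsum('abc,ia,jb,kc->ijk', core, *factors)     # Tucker multilinear product

# Toy example: a 30 x 30 x 30 tensor with multilinear rank (3, 3, 3).
sizes, ranks, n_basis = (30, 30, 30), (3, 3, 3), 5
rng = np.random.default_rng(0)
bases = [smooth_basis(n, n_basis) for n in sizes]              # fixed, not estimated
coeffs = [rng.standard_normal((n_basis, r)) for r in ranks]    # few free parameters per mode
core = rng.standard_normal(ranks)
X = smoothed_tucker_reconstruct(core, coeffs, bases)
print(X.shape)  # (30, 30, 30); each factor varies smoothly along its mode
```

Because each mode-m factor is parameterized by an (n_basis x r_m) coefficient matrix instead of a full (n_m x r_m) matrix, the number of free parameters per mode drops from n_m * r_m to n_basis * r_m, which is the benefit the abstract refers to.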
