Era of Big Data Processing: A New Approach via Tensor Networks and Tensor Decompositions

Many problems in computational neuroscience, neuroinformatics, pattern/image recognition, signal processing, and machine learning generate massive amounts of multidimensional data with multiple aspects and high dimensionality. Tensors (i.e., multi-way arrays) often provide a natural and compact representation for such data via suitable low-rank approximations. Big data analytics requires novel technologies that can process huge datasets within tolerable elapsed times. One such emerging technology for multidimensional big data is multiway analysis via tensor networks (TNs) and tensor decompositions (TDs), which represent tensors by sets of factor (component) matrices and lower-order (core) tensors. Dynamic tensor analysis allows us to discover meaningful hidden structures in complex data and to generalize by capturing multilinear and multi-aspect relationships. We discuss some fundamental TN models, their mathematical and graphical descriptions, and associated learning algorithms for large-scale TDs and TNs, with many potential applications, including anomaly detection, feature extraction, classification, cluster analysis, data fusion and integration, pattern recognition, predictive modeling, regression, time-series analysis, and multiway component analysis.

Keywords: Large-scale HOSVD, tensor decompositions, CPD, Tucker models, hierarchical Tucker (HT) decomposition, low-rank tensor approximations (LRA), tensorization/quantization, tensor train (TT/QTT), matrix product states (MPS), matrix product operator (MPO), DMRG, strong Kronecker product (SKP).
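To make the idea of representing a tensor by a chain of lower-order core tensors concrete, here is a minimal NumPy sketch of the tensor-train (TT-SVD) construction: a d-way array is unfolded mode by mode and factored with sequential truncated SVDs, yielding 3-way cores G_k of shape (r_{k-1}, n_k, r_k). This is an illustrative sketch only (the function names `tt_decompose` and `tt_reconstruct` are our own), not an implementation from the paper.

```python
import numpy as np

def tt_decompose(tensor, max_rank):
    """Sketch of TT-SVD: split a d-way array into a chain of 3-way cores
    G_k of shape (r_{k-1}, n_k, r_k) via sequential truncated SVDs."""
    dims = tensor.shape
    cores = []
    rank = 1
    C = tensor
    for n in dims[:-1]:
        C = C.reshape(rank * n, -1)               # unfold: current mode into rows
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        new_rank = min(max_rank, s.size)          # truncate to the target TT rank
        cores.append(U[:, :new_rank].reshape(rank, n, new_rank))
        C = s[:new_rank, None] * Vt[:new_rank]    # carry the remainder forward
        rank = new_rank
    cores.append(C.reshape(rank, dims[-1], 1))    # last core absorbs the remainder
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor."""
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=([-1], [0]))  # contract the shared TT-rank index
    return T.squeeze(axis=(0, -1))                # drop the boundary ranks r_0 = r_d = 1
```

For a tensor that genuinely has low TT rank, the reconstruction is exact up to numerical error; for general data, `max_rank` trades storage against approximation accuracy, which is the low-rank approximation idea underlying the TT/MPS format.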
