Shortcut Matrix Product States and Their Applications

Matrix Product States (MPS), known in mathematics as the Tensor Train (TT) decomposition, were originally proposed for describing quantum systems, especially one-dimensional ones, and have recently found use in diverse applications such as compressing high-dimensional data, supervised kernel-based classification, and unsupervised generative modeling. However, when applied to systems that are not defined on one-dimensional lattices, a serious drawback of MPS is the exponential decay of correlations, which limits their power to capture long-range dependencies among the variables of the system. To alleviate this problem, we propose introducing long-range interactions, which act as shortcuts, into MPS, resulting in a new model, \textit{Shortcut Matrix Product States} (SMPS). When the shortcuts are chosen properly, they can significantly decrease the correlation length of the MPS while preserving its computational efficiency. We develop efficient training methods for SMPS on various tasks, establish some of its mathematical properties, and show how to find good locations for adding shortcuts. Finally, through extensive numerical experiments we evaluate the performance of SMPS in a variety of applications, including function fitting, computing the partition function of the two-dimensional Ising model, and unsupervised generative modeling of handwritten digits, and illustrate its advantages over the vanilla matrix product states.
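To make the construction concrete, the following minimal NumPy sketch contrasts the contraction of a plain (periodic) MPS with that of an SMPS carrying a single shortcut bond between two distant sites. The function names, tensor layouts, and the naive summation over the shortcut index are our own illustrative assumptions, not the paper's implementation.

import numpy as np

# A minimal, illustrative sketch (not the authors' code) of how one shortcut
# changes the contraction of a periodic MPS. Tensor shapes and the placement
# of the shortcut index are assumptions made for clarity.

def mps_amplitude(tensors, x):
    """Plain MPS: psi(x) = Tr[ A_1[x_1] A_2[x_2] ... A_n[x_n] ].

    tensors[k] has shape (d, D, D): the physical index x_k selects a
    D x D matrix, and the bond indices are contracted along the chain.
    """
    M = tensors[0][x[0]]
    for A, xk in zip(tensors[1:], x[1:]):
        M = M @ A[xk]
    return np.trace(M)

def smps_amplitude(tensors, x, i, j, Ti, Tj):
    """SMPS with one shortcut bond (dimension Ds) connecting sites i < j.

    Ti and Tj have shape (d, Ds, D, D) and replace tensors[i] and
    tensors[j]. The shortcut index s is summed outside the matrix chain,
    so the cost is only a factor Ds larger than the plain MPS contraction.
    """
    total = 0.0
    for s in range(Ti.shape[1]):  # sum over the shortcut bond
        M = None
        for k in range(len(tensors)):
            if k == i:
                A = Ti[x[k], s]
            elif k == j:
                A = Tj[x[k], s]
            else:
                A = tensors[k][x[k]]
            M = A if M is None else M @ A
        total += np.trace(M)
    return total

Summing the extra bond index outside the chain keeps each inner pass a standard matrix-product contraction, which is one way the shortcut can preserve the computational efficiency the abstract refers to.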
