Tensor networks and efficient descriptions of classical data
