Asymptotically unbiased estimation of physical observables with neural samplers.

We propose a general framework for the estimation of observables with generative neural samplers, focusing on modern deep generative neural networks that provide an exact sampling probability. Within this framework, we present asymptotically unbiased estimators for generic observables, including those that explicitly depend on the partition function, such as the free energy or the entropy, and derive the corresponding variance estimators. We demonstrate their practical applicability with numerical experiments on the two-dimensional Ising model, which highlight their superiority over existing methods. Our approach greatly enhances the applicability of generative neural samplers to real-world physical systems.
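
As a concrete illustration of the kind of estimator described above, the sketch below implements self-normalised neural importance sampling for a toy two-dimensional Ising model, assuming a sampler that returns configurations together with their exact log-probabilities. The independent-spin stand-in for the neural sampler, the lattice size `L`, and all function and variable names are illustrative assumptions, not the paper's actual network or estimator definitions.

```python
# Minimal sketch (assumptions noted above): estimate a Boltzmann average and the
# free energy from samples of a sampler q(x) with exact sample probabilities.
import numpy as np

rng = np.random.default_rng(0)

L = 4          # linear size of the toy 2D Ising lattice
beta = 0.4     # inverse temperature
N = 10_000     # number of samples drawn from the sampler q

def energy(s):
    """Nearest-neighbour Ising energy with periodic boundaries."""
    return -np.sum(s * np.roll(s, 1, axis=0)) - np.sum(s * np.roll(s, 1, axis=1))

def sample_q():
    """Stand-in for a neural sampler: independent spins with p(up) = 0.5.
    A real generative sampler would return a configuration together with its
    exact log-probability log q(s)."""
    s = rng.choice([-1, 1], size=(L, L))
    log_q = L * L * np.log(0.5)
    return s, log_q

log_w = np.empty(N)   # log importance weights log[ exp(-beta*E(s)) / q(s) ]
obs = np.empty(N)     # observable values, here the energy per site
for i in range(N):
    s, log_q = sample_q()
    e = energy(s)
    log_w[i] = -beta * e - log_q
    obs[i] = e / (L * L)

# Asymptotically unbiased estimates via self-normalised importance sampling.
log_w_max = log_w.max()
w = np.exp(log_w - log_w_max)                 # stabilised weights
log_Z = log_w_max + np.log(w.mean())          # log partition function
O_mean = np.sum(w * obs) / np.sum(w)          # Boltzmann average of the observable
free_energy_per_site = -log_Z / (beta * L * L)

print(f"<E>/N ≈ {O_mean:.4f}")
print(f"F/N   ≈ {free_energy_per_site:.4f}")
```

Working with log-weights and subtracting their maximum before exponentiating keeps the partition-function estimate numerically stable when the weights span many orders of magnitude.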
