Statistical Predictions in String Theory and Deep Generative Models

Generative models in deep learning allow sampling from probability distributions that approximate data distributions. We propose using generative models to make approximate statistical predictions in the string theory landscape. For vacua admitting a Lagrangian description, this can be thought of as learning random tensor approximations of couplings. As a concrete proof of principle, we demonstrate in a large ensemble of Calabi-Yau manifolds that Kähler metrics evaluated at points in Kähler moduli space are well approximated by ensembles of matrices produced by a deep convolutional Wasserstein GAN. Accurate approximations of the Kähler metric eigenspectra are achieved with far fewer than h^{1,1} Gaussian draws. Accurate extrapolation to values of h^{1,1} outside the training set is achieved via a conditional GAN. Together, these results suggest the existence of strong correlations in the data, as might be expected if Reid's fantasy is correct.
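
A minimal sketch of the kind of pipeline described above, assuming a small fully connected WGAN-GP in PyTorch trained on synthetic symmetric matrices rather than the paper's deep convolutional architecture or its Calabi-Yau data; the matrix size, latent dimension, hyperparameters, and the Wishart-like placeholder ensemble are all illustrative assumptions. It shows how a low-dimensional Gaussian latent space (far fewer draws than h^{1,1}) can be mapped to an ensemble of h^{1,1} x h^{1,1} matrices whose eigenvalue spectra are then compared to the target ensemble.

```python
# Illustrative WGAN-GP sketch (not the paper's architecture or data).
import torch
import torch.nn as nn

H11 = 10          # assumed matrix size h^{1,1}
LATENT_DIM = 4    # far fewer Gaussian draws than h^{1,1}

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 128), nn.ReLU(),
            nn.Linear(128, H11 * H11),
        )
    def forward(self, z):
        a = self.net(z).view(-1, H11, H11)
        return 0.5 * (a + a.transpose(1, 2))   # symmetrize, like a metric

class Critic(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(H11 * H11, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )
    def forward(self, x):
        return self.net(x.view(-1, H11 * H11))

def gradient_penalty(critic, real, fake):
    # Standard WGAN-GP penalty on interpolates between real and fake samples.
    eps = torch.rand(real.size(0), 1, 1)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    return ((grad.view(grad.size(0), -1).norm(2, dim=1) - 1) ** 2).mean()

def sample_real(batch):
    # Placeholder "data": random Wishart-like symmetric matrices standing in
    # for Kähler metrics evaluated at points in moduli space.
    a = torch.randn(batch, H11, H11)
    return a @ a.transpose(1, 2) / H11

G, D = Generator(), Critic()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4, betas=(0.5, 0.9))
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4, betas=(0.5, 0.9))

for step in range(200):
    # Critic update: maximize D(real) - D(fake), with gradient penalty.
    real = sample_real(64)
    fake = G(torch.randn(64, LATENT_DIM)).detach()
    loss_d = D(fake).mean() - D(real).mean() + 10.0 * gradient_penalty(D, real, fake)
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    if step % 5 == 0:
        # Generator update (once per five critic updates): minimize -D(G(z)).
        fake = G(torch.randn(64, LATENT_DIM))
        loss_g = -D(fake).mean()
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Compare eigenvalue spectra of the generated and "real" ensembles.
with torch.no_grad():
    gen_eigs = torch.linalg.eigvalsh(G(torch.randn(256, LATENT_DIM)))
    real_eigs = torch.linalg.eigvalsh(sample_real(256))
```

The symmetrization inside the generator and the eigenvalue comparison at the end mirror the abstract's focus on Kähler metric eigenspectra; any distance between the two spectra (e.g. a histogram overlap or a Wasserstein distance) could serve as the accuracy check.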
