Machine learning phases of matter

Machine learning techniques, honed on big data sets, prove ideal for classifying condensed-matter phases and phase transitions; they can even detect non-trivial states that lack conventional order.

Condensed-matter physics is the study of the collective behaviour of infinitely complex assemblies of electrons, nuclei, magnetic moments, atoms or qubits1. This complexity is reflected in the size of the state space, which grows exponentially with the number of particles, reminiscent of the ‘curse of dimensionality’ commonly encountered in machine learning2. Despite this curse, the machine learning community has developed techniques with remarkable abilities to recognize, classify, and characterize complex sets of data. Here, we show that modern machine learning architectures, such as fully connected and convolutional neural networks3, can identify phases and phase transitions in a variety of condensed-matter Hamiltonians. Readily programmable through modern software libraries4,5, neural networks can be trained to detect multiple types of order parameter, as well as highly non-trivial states with no conventional order, directly from raw state configurations sampled with Monte Carlo6,7.
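As a concrete illustration of the supervised setup the abstract describes, the following minimal sketch (not the authors' code; the lattice size, temperatures, network architecture and training settings are illustrative assumptions) trains a small fully connected network in TensorFlow/Keras to separate the ordered and disordered phases of the two-dimensional Ising model directly from raw spin configurations generated by a simple Metropolis Monte Carlo sampler.

# Minimal sketch, assuming a 2D square-lattice Ising model with J = kB = 1.
# All parameter choices below are illustrative, not taken from the paper.
import numpy as np
import tensorflow as tf

def metropolis_samples(L, T, n_samples, n_equil=200, rng=None):
    """Return n_samples raw L x L Ising configurations at temperature T."""
    rng = rng if rng is not None else np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(L, L))
    samples = []
    for sweep in range(n_equil + n_samples):
        for _ in range(L * L):              # one Metropolis sweep = L*L single-spin updates
            i, j = rng.integers(0, L, size=2)
            nn_sum = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
                      + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nn_sum
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] = -spins[i, j]
        if sweep >= n_equil:                # keep configurations only after equilibration
            samples.append(spins.astype(np.float32).flatten())
    return np.stack(samples)               # consecutive sweeps are correlated; acceptable for a sketch

L, Tc = 16, 2.269                           # Tc = 2 / ln(1 + sqrt(2)) for the square lattice
temps = [1.5, 1.8, 2.0, 2.6, 3.0, 3.5]      # training temperatures away from Tc
X = np.concatenate([metropolis_samples(L, T, n_samples=200) for T in temps])
y = np.concatenate([np.full(200, int(T > Tc)) for T in temps])  # 0 = ordered, 1 = disordered

model = tf.keras.Sequential([               # small fully connected classifier on raw spins
    tf.keras.Input(shape=(L * L,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2, verbose=2)

In the same spirit, the trained network's class probabilities, averaged over test configurations generated on a fine temperature grid, would cross near the transition; for the square-lattice Ising model the exact value is Tc = 2/ln(1 + √2) ≈ 2.269.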

[1] Martín Abadi, et al. TensorFlow: Large-Scale Machine Learning on Heterogeneous Distributed Systems, 2016, arXiv.

[2] Andrea J. Liu, et al. A structural approach to relaxation in glassy liquids, 2015, Nature Physics.

[3] Sergei V. Kalinin, et al. Big-deep-smart data in imaging for guiding materials design, 2015, Nature Materials.

[4] Adolfo Avella, et al. Strongly Correlated Systems, 2012.

[5] J. Sólyom, et al. Strongly correlated systems, 2010, Physics Subject Headings (PhySH).

[6] Geoffrey E. Hinton, et al. Deep Learning, 2015, Nature.

[7] J. Vybíral, et al. Big data of materials science: critical role of the descriptor, 2014, Physical Review Letters.

[8] D. Poulin, et al. Practical learning method for multi-scale entangled states, 2012, arXiv:1204.0792.

[9] Matthew P. A. Fisher, et al. Quantum Field Theory of Many-Body Systems: From the Origin of Sound to an Origin of Light and Electrons, 2005.

[10] Manh Cuong Nguyen, et al. On-the-fly machine-learning for high-throughput experiments: search for rare-earth-free permanent magnets, 2014, Scientific Reports.

[11] John Preskill, et al. Topological entanglement entropy, 2005, Physical Review Letters.

[12] Gordon F. Newell, et al. Crystal Statistics of a Two-Dimensional Triangular Ising Lattice, 1950.

[13] Xiao-Gang Wen, et al. Detecting topological order in a ground state wave function, 2005, Physical Review Letters.

[14] C. Castelnovo, et al. Entanglement and topological entropy of the toric code at finite temperature, 2007, arXiv:0704.3616.

[15] Adolfo Avella, et al. Strongly Correlated Systems: Numerical Methods, 2013.

[16] S. Huber, et al. Learning phase transitions by confusion, 2016, Nature Physics.

[17] O. A. von Lilienfeld, et al. Machine learning for many-body physics: The case of the Anderson impurity model, 2014, arXiv:1408.1143.

[18] John B. Kogut, et al. An introduction to lattice gauge theory and spin systems, 1979.

[19] A. Kitaev. Fault-tolerant quantum computation by anyons, 1997, arXiv:quant-ph/9707021.

[21] A. Sandvik. Computational Studies of Quantum Spin Systems, 2010, arXiv:1101.3281.

[22] R. Bellman. Dynamic programming, 1957, Science.

[24] L. Onsager. Crystal statistics. I. A two-dimensional model with an order-disorder transition, 1944.

[25] X. Wen. Quantum Field Theory of Many-Body Systems: From the Origin of Sound to an Origin of Light and Electrons, 2004.

[26] Yoshua Bengio, et al. Gradient-based learning applied to document recognition, 1998, Proc. IEEE.

[27] Claudio Castelnovo, et al. Topological order and topological entropy in classical systems, 2006, arXiv:cond-mat/0610316.