Machine learning for condensed matter physics

Condensed Matter Physics (CMP) seeks to understand the microscopic interactions of matter at the quantum and atomistic levels, and to describe how these interactions give rise to both mesoscopic and macroscopic properties. CMP overlaps with many other important branches of science, such as Chemistry, Materials Science, Statistical Physics, and High-Performance Computing. Advances in modern Machine Learning (ML) have sparked keen interest in applying these algorithms to CMP, creating a compelling new area of research at the intersection of the two fields. In this review, we explore the main areas within CMP in which ML techniques have been applied successfully: the description and use of ML schemes for potential energy surfaces, the characterization of topological phases of matter in lattice systems, the prediction of phase transitions in off-lattice and atomistic simulations, the interpretation of ML theories through physics-inspired frameworks, and the enhancement of simulation methods with ML algorithms. We also discuss the main challenges and the outlook for future developments.
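
As a concrete illustration of one workflow surveyed in this review, namely detecting a phase transition in a lattice system with unsupervised learning, the sketch below samples 2D Ising configurations with a simple Metropolis Monte Carlo and applies PCA to the raw spins; the leading principal component tracks the magnetization and separates the ordered and disordered phases. This is only a minimal, illustrative example: the lattice size, temperature grid, and sweep counts are assumptions chosen for speed, not parameters taken from any of the works discussed.

```python
# Minimal sketch: unsupervised detection of the 2D Ising transition via PCA.
# All simulation parameters here are illustrative assumptions, kept small for speed.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
L = 16  # linear lattice size (illustrative)

def metropolis_sweep(spins, beta):
    """One Metropolis sweep over the L x L lattice with periodic boundaries."""
    for _ in range(spins.size):
        i, j = rng.integers(0, L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb  # energy change of flipping spin (i, j), J = 1
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

def sample_configs(T, n_samples=20, n_equil=200, n_skip=10):
    """Return n_samples flattened spin configurations sampled at temperature T."""
    spins = rng.choice([-1, 1], size=(L, L))
    beta = 1.0 / T
    for _ in range(n_equil):          # crude equilibration
        metropolis_sweep(spins, beta)
    configs = []
    for _ in range(n_samples):
        for _ in range(n_skip):       # decorrelate successive samples
            metropolis_sweep(spins, beta)
        configs.append(spins.flatten().copy())
    return np.array(configs)

temperatures = np.linspace(1.5, 3.5, 9)  # brackets Tc ~ 2.27 for the 2D Ising model
data, labels = [], []
for T in temperatures:
    X_T = sample_configs(T)
    data.append(X_T)
    labels.extend([T] * len(X_T))
X = np.vstack(data)

pca = PCA(n_components=2)
Z = pca.fit_transform(X)
# |PC1| correlates with the magnetization: large in the ordered phase, near zero above Tc.
labels = np.array(labels)
for T in temperatures:
    print(f"T = {T:.2f}  mean |PC1| = {np.abs(Z[labels == T, 0]).mean():.2f}")
```

The same pattern, sampling configurations over a control parameter and inspecting low-dimensional projections for a qualitative change, carries over to the off-lattice and atomistic settings discussed later in the review.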
