SE(3)-Equivariant Graph Neural Networks for Data-Efficient and Accurate Interatomic Potentials

This work presents Neural Equivariant Interatomic Potentials (NequIP), an SE(3)-equivariant neural network approach for learning interatomic potentials from ab initio calculations for molecular dynamics simulations. While most contemporary symmetry-aware models use invariant convolutions that act only on scalars, NequIP employs SE(3)-equivariant convolutions for interactions of geometric tensors, resulting in a more information-rich and faithful representation of atomic environments. The method achieves state-of-the-art accuracy on a challenging set of diverse molecules and materials while exhibiting remarkable data efficiency: NequIP outperforms existing models trained on up to three orders of magnitude more data, challenging the widely held belief that deep neural networks require massive training sets. This high data efficiency allows accurate potentials to be constructed using a high-order quantum-chemical level of theory as the reference and enables high-fidelity molecular dynamics simulations over long time scales.
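
To make the idea of an equivariant convolution concrete, the sketch below shows one such interaction layer written in PyTorch with the e3nn library. This is not the authors' implementation: the library choice, the irreducible-representation layout ("16x0e + 16x1o"), the maximum angular order, the radial-network width, and the class name EquivariantConv are all illustrative assumptions. Node features are geometric tensors (scalars and vectors here), neighbor directions enter through spherical harmonics, and the only distance-dependent learned quantities are invariant weights for the tensor product, which is what keeps the layer equivariant under rotations and translations.

    # Minimal sketch of an SE(3)-equivariant convolution layer, assuming the
    # e3nn library; irreps sizes, lmax, and the radial MLP width are illustrative.
    import torch
    from e3nn import o3
    from e3nn.nn import FullyConnectedNet

    class EquivariantConv(torch.nn.Module):
        def __init__(self, irreps_in="16x0e + 16x1o", irreps_out="16x0e + 16x1o",
                     lmax=2, radial_hidden=32):
            super().__init__()
            self.irreps_in = o3.Irreps(irreps_in)
            self.irreps_sh = o3.Irreps.spherical_harmonics(lmax)
            self.irreps_out = o3.Irreps(irreps_out)
            # Tensor product couples neighbor features with the angular part of each edge.
            self.tp = o3.FullyConnectedTensorProduct(
                self.irreps_in, self.irreps_sh, self.irreps_out, shared_weights=False
            )
            # Invariant radial network: interatomic distance -> per-edge tensor-product weights.
            self.radial = FullyConnectedNet(
                [1, radial_hidden, self.tp.weight_numel], torch.nn.functional.silu
            )

        def forward(self, node_feats, pos, edge_src, edge_dst):
            rel = pos[edge_dst] - pos[edge_src]           # relative vectors (translation invariant)
            dist = rel.norm(dim=-1, keepdim=True)         # rotation-invariant lengths
            sh = o3.spherical_harmonics(self.irreps_sh, rel, normalize=True,
                                        normalization="component")  # equivariant angular features
            weights = self.radial(dist)                   # scalar weights, one set per edge
            messages = self.tp(node_feats[edge_src], sh, weights)
            out = torch.zeros(node_feats.shape[0], self.irreps_out.dim,
                              dtype=node_feats.dtype, device=node_feats.device)
            return out.index_add_(0, edge_dst, messages)  # sum messages onto destination atoms

Because the distances and tensor-product weights are rotation invariant while the spherical harmonics and node features rotate consistently with the atomic positions, the output features transform equivariantly; an energy built from the final scalar features and differentiated with respect to positions then yields forces that rotate correctly with the system.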
