Tensor-reduced atomic density representations

Density-based representations of atomic environments that are invariant under Euclidean symmetries have become a widely used tool in the machine learning of interatomic potentials, in broader data-driven atomistic modeling, and in the visualization and analysis of materials datasets. The standard mechanism for incorporating chemical element information is to create a separate density for each element and form tensor products between them, which leads to a steep scaling of the representation size with the number of elements. Graph neural networks, which do not explicitly use density representations, escape this scaling by mapping the chemical element information into a fixed-dimensional space in a learnable way. By exploiting symmetry, we recast this approach as a tensor factorization of the standard neighbour-density-based descriptors and, using a new notation, identify connections to existing compression algorithms. In doing so, we form a compact tensor-reduced representation of the local atomic environment whose size does not depend on the number of chemical elements, which can be systematically converged, and which therefore remains applicable to a wide range of data analysis and regression tasks.
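
As a rough illustration of the scaling argument, the NumPy sketch below contrasts the standard per-element density coefficients, whose invariant tensor products carry two element indices and so grow quadratically with the number of elements S, with an embedding of the element axis into a fixed number of channels k. All array names, sizes, and the random "learnable" weights `W` are hypothetical stand-ins for a real radial/angular basis and a trained embedding, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (all hypothetical): S chemical elements, N radial basis
# functions, LM flattened angular (l, m) channels, J neighbours.
S, N, LM, J = 10, 8, 9, 20
z = rng.integers(0, S, size=J)        # element of each neighbour
R = rng.normal(size=(J, N))           # radial basis values per neighbour
Y = rng.normal(size=(J, LM))          # angular basis values per neighbour

# Standard route: one density per element, coefficients A[z, n, lm].
A = np.zeros((S, N, LM))
for j in range(J):
    A[z[j]] += np.outer(R[j], Y[j])

# Invariants built from tensor products of A carry two element indices,
# so their size grows quadratically with S.
p_standard = np.einsum('inl,jml->ijnm', A, A)
print(p_standard.size)                # S^2 * N^2 terms

# Tensor-reduced route: embed the element axis into k fixed channels with
# (learnable, here random) weights W, then correlate channel-wise.
k = 4
W = rng.normal(size=(k, S))
A_emb = np.einsum('ks,snl->knl', W, A)
p_reduced = np.einsum('knl,kml->knm', A_emb, A_emb)
print(p_reduced.size)                 # k * N^2 terms, independent of S
```

Note that the reduced route keeps only the diagonal in the embedding channel when correlating the two copies of `A_emb`; it is this channel-wise contraction that makes the final feature count independent of the number of elements S.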
