Reweighted autoencoded variational Bayes for enhanced sampling (RAVE).

Here we propose the reweighted autoencoded variational Bayes for enhanced sampling (RAVE) method, a new iterative scheme that uses the deep learning framework of variational autoencoders to enhance sampling in molecular simulations. RAVE iterates between molecular simulations and deep learning to produce an increasingly accurate probability distribution along a low-dimensional latent space that captures the key features of the molecular simulation trajectory. Using the Kullback-Leibler divergence between this latent-space distribution and the distributions of various trial reaction coordinates sampled from the molecular simulation, RAVE determines an optimal, yet physically interpretable, reaction coordinate and its associated probability distribution. Both then directly serve as the biasing protocol for a new biased simulation, which is once again fed into the deep learning module with appropriate weights that account for the bias; the procedure continues until estimates of the desired thermodynamic observables converge. Unlike recent methods that use deep learning for enhanced sampling, RAVE stands out in that (a) it naturally produces a physically interpretable reaction coordinate, (b) it does not rely on existing enhanced sampling protocols to enhance fluctuations along the latent space identified via deep learning, and (c) it makes it easy to filter out spurious solutions learned by the deep learning procedure. The usefulness and reliability of RAVE are demonstrated by applying it to model potentials of increasing complexity, including computation of the binding free-energy profile for a hydrophobic ligand-substrate system in explicit water with a dissociation time of more than 3 min, using at least twenty times less computer time than umbrella sampling or metadynamics.
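To make the workflow concrete, the following is a minimal, hypothetical sketch of one RAVE-style iteration in Python/PyTorch: a variational autoencoder is trained on a trajectory of order parameters with per-frame unbiasing weights proportional to exp(beta*V_bias), and the Kullback-Leibler divergence between the learned latent-space distribution and the distributions of trial linear-combination reaction coordinates is used to select an interpretable coordinate. The variable names, network architecture, optimizer settings, and histogram parameters are illustrative assumptions, not the authors' reference implementation.

```python
# Minimal, hypothetical sketch of one RAVE-style iteration (illustrative only).
# Assumes a trajectory of order parameters X (n_frames x n_ops) and per-frame
# bias energies V_bias from the previous biased run (zeros if unbiased).
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)
n_frames, n_ops, latent_dim = 5000, 2, 1
X = torch.randn(n_frames, n_ops)          # placeholder trajectory data
V_bias = torch.zeros(n_frames)            # placeholder bias energies (kT units)
beta = 1.0                                # 1 / kT
w = torch.exp(beta * V_bias)              # unbiasing weights for biased frames
w = w / w.sum()

class VAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(n_ops, 32), nn.ReLU())
        self.mu = nn.Linear(32, latent_dim)
        self.logvar = nn.Linear(32, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(),
                                 nn.Linear(32, n_ops))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        return self.dec(z), mu, logvar

model = VAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(200):
    xhat, mu, logvar = model(X)
    recon = ((xhat - X) ** 2).sum(dim=1)                           # per-frame error
    kld = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1)
    loss = (w * (recon + kld)).sum()                               # reweighted ELBO
    opt.zero_grad(); loss.backward(); opt.step()

# Screen interpretable trial RCs chi(theta) = cos(theta)*s1 + sin(theta)*s2
# against the learned latent variable using a histogram KL divergence.
with torch.no_grad():
    z = model.mu(model.enc(X)).squeeze().numpy()

def standardize(v):
    return (v - v.mean()) / (v.std() + 1e-12)

def hist_prob(vals, weights, bins=50):
    p, _ = np.histogram(standardize(vals), bins=bins, range=(-4, 4),
                        weights=weights, density=True)
    return p + 1e-12                                               # avoid log(0)

def kl_div(p, q):
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

ops, wts = X.numpy(), w.numpy()
p_z = hist_prob(z, wts)
scores = [(kl_div(p_z, hist_prob(np.cos(t) * ops[:, 0] + np.sin(t) * ops[:, 1], wts)), t)
          for t in np.linspace(0.0, np.pi, 60)]
best_kl, best_theta = min(scores)
print(f"best trial RC: cos({best_theta:.2f})*s1 + sin({best_theta:.2f})*s2, KL = {best_kl:.4f}")
```

In an actual application, X and V_bias would come from the previous (biased) molecular dynamics run, and the selected coordinate together with the reweighted latent-space distribution would define the bias for the next round of simulation.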
