Blue Matter: Strong Scaling of Molecular Dynamics on Blue Gene/L

This paper presents strong scaling performance data for the Blue Matter molecular dynamics framework, using a novel n-body spatial decomposition and a collective communications technique implemented on both MPI and low-level hardware interfaces. Using Blue Matter on Blue Gene/L, we have measured scalability through 16,384 nodes, with a measured time per time step of under 2.3 milliseconds for a 43,222-atom protein/lipid system. This is equivalent to a simulation rate of over 76 nanoseconds per day and represents an unprecedented time-to-solution for biomolecular simulation, with continued speedup down to fewer than three atoms per node. On a smaller solvated lipid system with 13,758 atoms, we have achieved continued speedup down to fewer than one atom per node at less than 2 milliseconds per time step. On a 92,224-atom system, we have achieved floating-point performance of over 1.8 teraflops on 16,384 nodes. Strong scaling of fixed-size classical molecular dynamics of biological systems to large numbers of nodes is necessary to extend simulation times to the scale required to make contact with experimental data and derive biologically relevant insights.
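The quoted simulation rate and per-node atom counts follow directly from the numbers above. As a minimal back-of-the-envelope check, assuming a 2 fs integration time step (a typical value for biomolecular MD; the abstract does not state the actual step size):

# Sanity check of the rates quoted in the abstract.
# ASSUMPTION: a 2 fs integration time step, common in biomolecular MD;
# the abstract does not specify the step size actually used.

SECONDS_PER_DAY = 86_400.0
TIMESTEP_FS = 2.0  # assumed integration step, in femtoseconds

def ns_per_day(ms_per_step: float, timestep_fs: float = TIMESTEP_FS) -> float:
    """Simulated nanoseconds per wall-clock day for a given per-step cost."""
    steps_per_day = SECONDS_PER_DAY / (ms_per_step * 1e-3)
    return steps_per_day * timestep_fs * 1e-6  # fs/day -> ns/day

# ~2.27 ms/step reproduces the quoted 76 ns/day; 2.3 ms/step gives ~75 ns/day.
print(f"{ns_per_day(2.27):.1f} ns/day at 2.27 ms/step")
print(f"{ns_per_day(2.30):.1f} ns/day at 2.30 ms/step")

# Atoms per node at 16,384 nodes, matching the scaling claims:
# 43,222 -> ~2.6 (fewer than three), 13,758 -> ~0.8 (fewer than one).
for atoms in (43_222, 13_758, 92_224):
    print(f"{atoms} atoms -> {atoms / 16_384:.2f} atoms/node")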
