Towards Distributed Petascale Computing

In this chapter we argue that studying such multi-scale, multi-science systems gives rise to inherently hybrid models containing many different algorithms that are best serviced by different types of computing environments (ranging from massively parallel computers, via large-scale special-purpose machines, to clusters of PCs), whose total integrated computing capacity can easily reach the PFlop/s scale. Such hybrid models, combined with the by now inherently distributed nature of the data on which they `feed', suggest a distributed computing model in which parts of the multi-scale, multi-science model are executed on the most suitable computing environment, and/or the computations are carried out close to the required data (i.e. the computations are brought to the data instead of the other way around). We present an estimate of the compute requirements for simulating the Galaxy, as a typical example of a multi-scale, multi-physics application requiring distributed Petaflop/s computational power.
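To illustrate why a Galaxy-scale simulation quickly pushes past the Petaflop/s regime, the back-of-envelope sketch below estimates the sustained compute rate required for a brute-force direct N-body integration. All parameters (the star count, the flop cost per pairwise force evaluation, the number of timesteps, and the target wall-clock time) are illustrative assumptions, not figures taken from the chapter.

```python
# Back-of-envelope estimate of the sustained compute rate for a direct
# N-body simulation of a Galaxy-sized stellar system.
# All parameters below are illustrative assumptions, not values from the chapter.

N_STARS = 1.0e11          # assumed number of stars in the Galaxy
FLOPS_PER_PAIR = 60.0     # assumed flop cost of one pairwise force evaluation
N_STEPS = 1.0e4           # assumed number of shared integration timesteps
WALLCLOCK_S = 3.0e7       # assumed target wall-clock time (~1 year in seconds)

# Direct summation evaluates all N*(N-1)/2 particle pairs every timestep.
pairs_per_step = N_STARS * (N_STARS - 1) / 2.0
total_flops = pairs_per_step * FLOPS_PER_PAIR * N_STEPS

# Sustained rate needed to finish within the target wall-clock time.
required_flops_per_s = total_flops / WALLCLOCK_S

print(f"total work:     {total_flops:.2e} flop")
print(f"sustained rate: {required_flops_per_s:.2e} flop/s "
      f"({required_flops_per_s / 1e15:.1e} PFlop/s)")
```

Even with these optimistic assumptions, direct summation lands many orders of magnitude above what a single PFlop/s machine can deliver, which is precisely why hierarchical O(N log N) force-calculation schemes, special-purpose hardware, and a distributed mix of computing architectures become attractive for this class of problem.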
