JSBricks: a suite of microbenchmarks for the evaluation of Java as a scientific execution environment

Abstract The widely recognized merits of Java technology have made it appealing also for the development of computation-intensive scientific and engineering programs. However, the software layers required to execute Java code on different platforms impose a performance penalty that is often too costly. In order to evaluate, at a low level of detail, the support provided by different execution environments, we have developed JSBricks (Java Scientific Bricks), a suite of microbenchmarks implementing basic algorithms that are widely used in complex scientific applications. In this paper we analyse the performance of the JSBricks benchmarks on different execution environments, and we provide insights into aspects that could be optimized to enhance the quality of Java code execution.