Quality Assurance in Performance: Evaluating Mono Benchmark Results

Performance is an important aspect of software quality. To prevent performance degradation during software development, performance can be monitored, and modifications that damage it can be reverted or optimized. Regression benchmarking provides a means of automated performance monitoring, yielding a list of software modifications potentially associated with performance changes. We focus on locating individual modifications as causes of individual performance changes, and we present three methods that help narrow down the list of modifications potentially associated with a performance change. We illustrate the entire process on a real-world project.
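As a rough illustration of the kind of automated monitoring the abstract describes (not the paper's exact method), a regression-benchmarking harness can compare repeated benchmark runs of consecutive revisions and flag a revision when the shift in mean run time is statistically large. The sketch below uses Welch's t-statistic with an illustrative threshold; the revision identifiers and run times are hypothetical.

```python
# Minimal sketch of change detection in regression benchmarking:
# compare repeated benchmark runs of consecutive revisions and flag
# a revision when the mean run time shifts by more than a few
# standard errors. Threshold and data are illustrative assumptions.
import math
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t-statistic for two independent samples of run times."""
    se = math.sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return (mean(b) - mean(a)) / se

def flag_changes(results, threshold=3.0):
    """results: list of (revision_id, [run times in ms]), in commit order.
    Returns the revisions whose runs differ notably from the previous ones."""
    flagged = []
    for (_, prev), (rev, cur) in zip(results, results[1:]):
        if abs(welch_t(prev, cur)) > threshold:
            flagged.append(rev)
    return flagged

# Hypothetical run times (ms) for three consecutive revisions:
history = [
    ("r100", [10.1, 10.3, 9.9, 10.2, 10.0]),
    ("r101", [10.2, 10.0, 10.1, 10.3, 9.8]),
    ("r102", [12.0, 11.8, 12.1, 11.9, 12.2]),  # simulated regression
]
print(flag_changes(history))  # → ['r102']
```

A real harness would also have to control for noise sources such as random initial state and code layout, which the paper's cited work discusses; a fixed t-threshold is only a first approximation.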
