Generic Environment for Full Automation of Benchmarking

Regression testing is an important part of software quality assurance. We work to extend regression testing with regression benchmarking, which applies benchmarking to detect regressions in performance. Because regression benchmarking imposes specific requirements, many contemporary benchmarks are not directly usable for it. To overcome this, we make a case for a generic benchmarking environment that facilitates the use of contemporary benchmarks in regression benchmarking, analyze its requirements, and propose an architecture for such an environment.
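
As a rough illustration of the regression benchmarking idea described above (not part of the original paper), the sketch below compares response-time samples from two software versions with Welch's t-test and flags a statistically significant slowdown. The sample data, the metric, and the 5% significance threshold are assumptions chosen for the example.

```python
# Minimal sketch of regression benchmarking: detect a performance
# regression by comparing benchmark samples from two builds.
# All names and data here are illustrative assumptions, not the
# paper's actual environment.
import math
import statistics


def welch_t(old: list[float], new: list[float]) -> float:
    """Welch's t-statistic for the difference of two sample means."""
    m_old, m_new = statistics.mean(old), statistics.mean(new)
    v_old, v_new = statistics.variance(old), statistics.variance(new)
    se = math.sqrt(v_old / len(old) + v_new / len(new))
    return (m_new - m_old) / se


# Hypothetical response times (ms) from repeated benchmark runs of
# the previous build and the current build under test.
baseline = [10.2, 10.4, 10.1, 10.3, 10.5, 10.2, 10.4, 10.3]
candidate = [11.0, 11.2, 10.9, 11.1, 11.3, 11.0, 11.2, 11.1]

t = welch_t(baseline, candidate)
# With roughly 14 degrees of freedom, t > ~1.76 rejects "no slowdown"
# at the 5% level (one-sided); treat that as a suspected regression.
if t > 1.76:
    print(f"possible performance regression (t = {t:.2f})")
else:
    print(f"no significant change (t = {t:.2f})")
```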
