CASPA: A Platform for Comparability of Architecture-Based Software Performance Engineering Approaches

Setting up an experimental evaluation for architecture-based Software Performance Engineering (SPE) approaches requires enormous effort, including the selection and installation of representative applications, usage profiles, supporting tools, infrastructures, etc. Quantitative comparisons with related approaches are hardly possible because previous experiments can rarely be repeated by other researchers. This paper presents CASPA, a ready-to-use and extensible evaluation platform that already includes example applications and state-of-the-art SPE components such as monitoring and model extraction. The platform explicitly provides interfaces for replacing applications and components with custom(ized) ones, and it builds on state-of-the-art technologies such as container-based virtualization.
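To illustrate the kind of component exchange the platform aims to enable, the following is a minimal sketch (not the actual CASPA interface) of swapping a containerized SPE component, such as a monitoring agent, for a customized one. It assumes a plain Docker setup via the Docker SDK for Python; the image, container, and network names are purely illustrative, and the platform itself may rely on a different container orchestration stack.

```python
# Sketch: replace a default containerized monitoring component with a custom one.
# All names (caspa-monitoring, caspa-net, the image tag) are hypothetical.
import docker

client = docker.from_env()

# Stop and remove the default monitoring component, if it is running.
for container in client.containers.list(all=True, filters={"name": "caspa-monitoring"}):
    container.stop()
    container.remove()

# Start a custom(ized) monitoring component behind the same assumed interface,
# i.e. the same container name and exposed port as the component it replaces.
client.containers.run(
    "my-registry/custom-monitoring:latest",  # hypothetical custom image
    name="caspa-monitoring",
    ports={"8080/tcp": 8080},
    network="caspa-net",                     # hypothetical platform network
    detach=True,
)
```

The key design idea sketched here is that components interact only through stable, container-level interfaces (names, ports, data formats), so a custom component can stand in for a provided one without touching the rest of the platform.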
