A model-driven approach to broaden the detection of software performance antipatterns at runtime

Performance antipatterns document bad design practices that negatively affect system performance. In our previous work we formalized such antipatterns as logical predicates built on four different views: (i) the static view, which captures the software elements (e.g., classes, components) and the static relationships among them; (ii) the dynamic view, which represents the interactions (e.g., messages) that occur between the software elements to provide system functionalities; (iii) the deployment view, which describes the hardware elements (e.g., processing nodes) and the mapping of software resources onto hardware platforms; and (iv) the performance view, which collects a set of specific performance indices. In this paper we present a lightweight infrastructure that enables the detection of software performance antipatterns at runtime through the monitoring of specific performance indices. The proposed approach pre-evaluates the logical predicates of the antipatterns, identifies those whose static, dynamic, and deployment sub-predicates hold in the current system configuration, and defers the verification of the performance sub-predicates to runtime. The infrastructure leverages model-driven techniques to generate probes that monitor the performance sub-predicates, thereby supporting the detection of antipatterns at runtime.
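The split between structural sub-predicates (evaluated once, offline, on the static/dynamic/deployment views) and performance sub-predicates (verified at runtime on monitored indices) can be sketched as follows. This is a minimal illustration, not the paper's actual formalization: the `Antipattern` and `Component` types, the "Blob"-style antipattern, and the numeric thresholds are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Component:
    name: str
    connections: int  # static view: relationships to other software elements

@dataclass
class Antipattern:
    name: str
    # Structural sub-predicate: checked once against the model views.
    structural: Callable[[Component], bool]
    # Performance sub-predicate: checked at runtime on monitored indices.
    performance: Callable[[Dict[str, float]], bool]

# Hypothetical "Blob"-like antipattern; thresholds are illustrative only.
blob = Antipattern(
    name="Blob",
    structural=lambda c: c.connections > 5,
    performance=lambda idx: idx.get("utilization", 0.0) > 0.8,
)

def preselect(components: List[Component], ap: Antipattern) -> List[Component]:
    """Offline step: keep only components whose structural sub-predicate holds.
    Only these candidates need runtime probes for the performance part."""
    return [c for c in components if ap.structural(c)]

def detect_at_runtime(candidates: List[Component], ap: Antipattern,
                      indices: Dict[str, Dict[str, float]]) -> List[str]:
    """Runtime step: verify the performance sub-predicate on the indices
    collected by the monitoring probes for each candidate."""
    return [c.name for c in candidates
            if ap.performance(indices.get(c.name, {}))]

components = [Component("Scheduler", 8), Component("Logger", 2)]
candidates = preselect(components, blob)   # only "Scheduler" survives pre-selection
monitored = {"Scheduler": {"utilization": 0.93}}
print(detect_at_runtime(candidates, blob, monitored))  # ['Scheduler']
```

In this sketch, pre-selection prunes the search space so that runtime monitoring (and probe generation) is only needed for components whose structural conditions already match, mirroring the lightweight character of the proposed infrastructure.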
