Rule-based automatic software performance diagnosis and improvement

The performance of a software system is the result of many interacting factors. This paper describes a rule-based framework to identify the root causes of performance limits, to untangle the effects of the system configuration (such as the allocation of processors) from limits imposed by the software design, and to recommend both configuration and design improvements. The framework uses a performance model that represents (and is derived from) a UML design model, and applies transformations to the given performance model to obtain an improved one. The improvements imply configuration and design changes that can be applied to the system. This paper describes the approach and demonstrates its feasibility by applying a small set of rules to the design of a web application.
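To make the approach concrete, the sketch below shows the general shape of one diagnosis-and-improvement rule of the kind the framework applies: detect a saturated resource in the performance model, then recommend either a configuration change (more processors) or a design change (higher task multiplicity). This is a minimal illustration under assumed names; PerformanceModel, Task, Processor, addReplica, and increaseMultiplicity are hypothetical and do not reflect the paper's actual rule engine or model API.

```java
import java.util.Comparator;
import java.util.List;

public class BottleneckRule {
    // Assumed utilization threshold above which a resource counts as saturated.
    static final double SATURATION = 0.9;

    /**
     * Apply one rule: find the most heavily utilized task and decide whether
     * the limit is a configuration issue (its host processor is saturated)
     * or a design issue (the task itself is the bottleneck).
     */
    public static String diagnose(PerformanceModel model) {
        Task bottleneck = model.tasks().stream()
                .max(Comparator.comparingDouble(Task::utilization))
                .orElseThrow();
        if (bottleneck.utilization() < SATURATION) {
            return "No saturated resource found; the limit lies elsewhere.";
        }
        if (bottleneck.hostProcessor().utilization() >= SATURATION) {
            bottleneck.hostProcessor().addReplica();   // configuration change
            return "Configuration: add a processor replica hosting " + bottleneck.name();
        }
        bottleneck.increaseMultiplicity();             // design change
        return "Design: raise the multithreading level of " + bottleneck.name();
    }
}

// Supporting interfaces, also assumptions made for this sketch.
interface PerformanceModel { List<Task> tasks(); }
interface Task {
    String name();
    double utilization();
    Processor hostProcessor();
    void increaseMultiplicity();
}
interface Processor {
    double utilization();
    void addReplica();
}
```

In a realistic setting, such a rule would transform the performance model rather than the running system, and the transformed model would be re-solved to check that the recommended change actually relieves the bottleneck before it is applied.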
