Towards an Efficient Performance Testing Through Dynamic Workload Adaptation

Performance testing is a critical task to ensure an acceptable user experience with software systems, especially when large numbers of concurrent users are expected. Selecting an appropriate test workload is a challenging and time-consuming process that relies heavily on the testers' expertise. Not only are workloads application-dependent, but it is also usually unclear how large a workload must be to expose any performance issues that exist in an application. Previous research has proposed dynamically adapting test workloads in real time based on the application's behavior. By reducing the need for the trial-and-error test cycles required when using static workloads, dynamic workload adaptation can lower the effort and expertise needed to carry out performance testing. However, such approaches usually require testers to properly configure several parameters in order to be effective at identifying workload-dependent performance bugs, which may hinder their adoption by practitioners. To address this issue, this paper examines the criteria needed to conduct performance testing efficiently using dynamic workload adaptation. We present the results of comprehensively evaluating one such approach, providing insights into how to tune it properly to obtain better outcomes across different scenarios. We also study how varying its configuration affects the results obtained.
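
To make the core idea concrete, the following is a minimal sketch of an in-test workload adaptation loop: ramp up the number of concurrent users while the system meets a response-time objective, and stop once degradation is observed, rather than fixing the workload size up front. This is an illustrative assumption of how such a feedback loop could look, not the evaluated tool's implementation; all names and thresholds (send_request, RESPONSE_TIME_SLA, STEP_USERS, MAX_USERS) are hypothetical.

```python
# Hypothetical sketch of dynamic workload adaptation (not the paper's tool).
# The workload grows step by step; the observed response time drives when
# the ramp-up stops, removing the need to guess the workload size up front.
import concurrent.futures
import random
import time

RESPONSE_TIME_SLA = 0.5  # seconds; assumed acceptable mean response time
STEP_USERS = 10          # assumed number of users added per adaptation step
MAX_USERS = 500          # assumed safety cap on workload size

def send_request() -> float:
    """Placeholder for one request against the system under test.
    Returns the observed response time in seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.05, 0.2))  # stand-in for real network I/O
    return time.perf_counter() - start

def run_step(users: int) -> float:
    """Issue `users` concurrent requests and return the mean response time."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(lambda _: send_request(), range(users)))
    return sum(latencies) / len(latencies)

users = STEP_USERS
while users <= MAX_USERS:
    mean_latency = run_step(users)
    print(f"{users:4d} users -> mean response time {mean_latency:.3f}s")
    if mean_latency > RESPONSE_TIME_SLA:
        # Degradation detected: this workload level exposes the bottleneck,
        # so the adaptation stops instead of continuing a blind ramp-up.
        print(f"Degradation detected at {users} users; stopping ramp-up.")
        break
    users += STEP_USERS
```

In practice, the adaptation parameters (step size, degradation threshold, stopping criterion) are exactly the kind of configuration choices whose tuning this paper investigates.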
