A factor framework for experimental design for performance evaluation of commercial cloud services

Given the diversity of commercial Cloud services, performance evaluation of candidate services is crucial and beneficial both for service customers (e.g., for cost-benefit analysis) and for providers (e.g., for directing service improvement). Before an evaluation is implemented, selecting suitable factors (also called parameters or variables) is a prerequisite for designing the evaluation experiments. However, there seems to be a lack of systematic approaches to factor selection for Cloud services performance evaluation: in most existing evaluation studies, evaluators have chosen experimental factors in an ad hoc or intuitive manner. Based on our previous taxonomy and modeling work, this paper proposes a factor framework for experimental design for performance evaluation of commercial Cloud services. The framework encapsulates the state of the practice of the performance evaluation factors currently considered in the Cloud Computing domain, and can in turn facilitate the design of new experiments for evaluating Cloud services.
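For intuition, a "factor" here is a controllable variable of an experiment with two or more levels (e.g., VM instance type or workload size), and a chosen set of factors induces the space of experimental configurations to measure. The minimal Python sketch below illustrates this idea; the factor names and levels are hypothetical examples, not taken from the paper's framework, and it enumerates a full-factorial design over the selected factors:

# Illustrative sketch only: factor names and levels are hypothetical
# examples of the kinds of factors catalogued for Cloud service evaluation.
from itertools import product

factors = {
    "instance_type": ["m1.small", "m1.large"],  # capacity-related factor
    "region": ["us-east", "eu-west"],           # location-related factor
    "workload_size_gb": [1, 10],                # workload-related factor
}

names = list(factors)
# Full-factorial design: every combination of factor levels is one trial.
design = [dict(zip(names, levels)) for levels in product(*factors.values())]

for trial in design:
    print(trial)  # each dict is one experimental configuration to evaluate

A full-factorial enumeration like this grows multiplicatively with every added factor, which is exactly why systematic factor selection, as the proposed framework supports, matters before committing to an experimental design.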
