Cloud Performance Modeling with Benchmark Evaluation of Elastic Scaling Strategies

In this paper, we present generic cloud performance models for evaluating IaaS, PaaS, SaaS, and mashup or hybrid clouds. We test clouds with real-life benchmark programs and propose several new performance metrics. Our benchmark experiments are conducted mainly on IaaS cloud platforms under scale-out and scale-up workloads. The benchmarking results are analyzed in terms of efficiency, elasticity, QoS, productivity, and scalability. Five cloud benchmarks were tested on the Amazon EC2 IaaS cloud: YCSB, CloudSuite, HiBench, BenchClouds, and TPC-W. To satisfy production services, the choice between scale-up and scale-out solutions should be driven primarily by the workload patterns and the resource utilization rates required. Scaling out machine instances incurs much lower overhead than that observed in scale-up experiments; however, scaling up proves more cost-effective for sustaining heavier workloads. Cloud productivity is largely attributable to system elasticity, efficiency, QoS, and scalability. We find that auto-scaling is easy to implement but tends to over-provision resources, so it may yield lower resource utilization than scale-out or scale-up strategies. We also demonstrate that the proposed cloud performance models are applicable to evaluating PaaS, SaaS, and hybrid clouds.
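
To make the scale-up versus scale-out versus auto-scaling comparison concrete, the sketch below computes two illustrative metrics from hypothetical benchmark measurements: a productivity proxy (QoS-weighted throughput delivered per dollar per hour) and a resource utilization ratio. This is a minimal sketch under assumed definitions; the paper's exact metric formulations, benchmark numbers, and instance configurations are not reproduced here, and every figure in the example is hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ScalingRun:
    """One benchmark run of a scaling strategy (all numbers are hypothetical)."""
    name: str
    throughput_ops: float         # sustained throughput (operations per second)
    qos: float                    # fraction of requests meeting the latency SLO (0..1)
    hourly_cost_usd: float        # total cost of the provisioned instances per hour
    provisioned_capacity: float   # aggregate capacity the instances could deliver (ops/s)


def productivity(run: ScalingRun) -> float:
    """QoS-weighted throughput per dollar per hour; an illustrative proxy,
    not necessarily the paper's exact productivity formulation."""
    return run.throughput_ops * run.qos / run.hourly_cost_usd


def utilization(run: ScalingRun) -> float:
    """Fraction of the provisioned capacity that is actually used."""
    return run.throughput_ops / run.provisioned_capacity


# Hypothetical runs mirroring the abstract's qualitative findings:
# scale-out has low scaling overhead, scale-up sustains heavy load cost-effectively,
# and auto-scaling over-provisions and therefore shows lower utilization.
runs = [
    ScalingRun("scale-out",    throughput_ops=9000,  qos=0.97, hourly_cost_usd=4.0, provisioned_capacity=11000),
    ScalingRun("scale-up",     throughput_ops=12000, qos=0.95, hourly_cost_usd=4.5, provisioned_capacity=14000),
    ScalingRun("auto-scaling", throughput_ops=9000,  qos=0.98, hourly_cost_usd=5.5, provisioned_capacity=16000),
]

for run in runs:
    print(f"{run.name:12s}  productivity={productivity(run):8.1f} ops/s per $/h"
          f"  utilization={utilization(run):5.1%}")
```

Under these assumed numbers, auto-scaling delivers good QoS but shows the lowest utilization and productivity because it keeps more capacity provisioned than the workload needs, which matches the qualitative finding above; the actual magnitudes depend entirely on the workload and instance types chosen.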
