Instance launch-time analysis of OpenStack virtualization technologies with control plane network errors

We analyzed the performance of a multi-node OpenStack cloud under different types of controlled, self-induced network errors between the controller and compute nodes on the control plane network. These errors included limited bandwidth, delays, and packet losses of varying severity. The study compares the effects of these errors on the spawning times of batches of instances created with three virtualization technologies supported by OpenStack: Docker containers, Linux containers, and KVM virtual machines. We identified the bandwidth threshold below which, and the delay and packet-loss thresholds beyond which, instances fail to launch. To the best of our knowledge, this is the first comparative measurement study of its kind on OpenStack. The results should be of particular interest to designers and administrators of distributed OpenStack deployments.
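
As an illustration of how such errors can be injected, the sketch below uses Linux traffic control (tc) with the netem queuing discipline, which can impose delay, random packet loss, and a bandwidth cap on a single interface. This is a minimal sketch, not the paper's actual tooling; the control plane interface name `eth1` and the example severities are assumptions.

```python
#!/usr/bin/env python3
"""Minimal sketch: impose control-plane network errors with tc/netem.

Assumptions (not from the paper): the control plane NIC on this host
is `eth1`, and the script runs with root privileges.
"""
import subprocess

CONTROL_PLANE_IF = "eth1"  # hypothetical control plane interface


def apply_impairment(delay_ms=0, loss_pct=0.0, rate_mbit=None):
    """Replace the root qdisc with a netem discipline combining
    delay, random packet loss, and (optionally) a bandwidth cap."""
    cmd = ["tc", "qdisc", "replace", "dev", CONTROL_PLANE_IF, "root", "netem"]
    if delay_ms:
        cmd += ["delay", f"{delay_ms}ms"]
    if loss_pct:
        cmd += ["loss", f"{loss_pct}%"]
    if rate_mbit:
        cmd += ["rate", f"{rate_mbit}mbit"]  # netem rate (kernel >= 3.3)
    subprocess.run(cmd, check=True)


def clear_impairment():
    """Restore the default qdisc; ignore the error if none was set."""
    subprocess.run(["tc", "qdisc", "del", "dev", CONTROL_PLANE_IF, "root"],
                   check=False)


if __name__ == "__main__":
    # Example severity: 200 ms delay, 2% loss, 10 Mbit/s cap.
    apply_impairment(delay_ms=200, loss_pct=2.0, rate_mbit=10)
```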
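Likewise, spawn times for a batch of instances can be measured against the Nova API. The sketch below uses openstacksdk and times each server from the create call until Nova reports ACTIVE. It is a hedged sketch, not the paper's measurement harness: the cloud name `perf-lab`, the image/flavor/network names, and the batch size are assumptions, and the hypervisor backend (Docker, LXC, or KVM) is selected by the compute nodes' Nova configuration rather than by the client.

```python
#!/usr/bin/env python3
"""Minimal sketch: time a batch of instance launches with openstacksdk.

Assumptions (not from the paper): a clouds.yaml entry named `perf-lab`
and a CirrOS image, m1.tiny flavor, and `private` network available.
"""
import time

import openstack

BATCH_SIZE = 10  # example batch size; launched sequentially here

conn = openstack.connect(cloud="perf-lab")      # hypothetical cloud name
image = conn.compute.find_image("cirros")       # assumed image name
flavor = conn.compute.find_flavor("m1.tiny")    # assumed flavor name
network = conn.network.find_network("private")  # assumed network name

spawn_times = []
for i in range(BATCH_SIZE):
    t0 = time.monotonic()
    server = conn.compute.create_server(
        name=f"probe-{i}",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    # Block until Nova reports ACTIVE (raises on ERROR or timeout);
    # the elapsed wall-clock time is the spawn time we record.
    conn.compute.wait_for_server(server, status="ACTIVE", wait=600)
    spawn_times.append(time.monotonic() - t0)

print(f"mean spawn time: {sum(spawn_times) / len(spawn_times):.1f}s")
```

Repeating this loop under each netem setting, once per hypervisor backend, yields the per-technology launch-time curves the abstract describes.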
