Benchmarking elasticity of FaaS platforms as a foundation for objective-driven design of serverless applications

Application providers must navigate the trade-off between performance and deployment costs by selecting the "right" amount of provisioned computing resources for their application. The high value of being able to revise this trade-off decision at runtime has fueled a decade of combined industry and research efforts to develop elastic applications. Despite these efforts, developing elastic applications still demands significant time and expertise from application providers. To address this demand, FaaS platforms shift the responsibilities associated with elasticity from the application developer to the cloud provider. While this shift is highly promising, cloud providers do not quantify the elasticity of their FaaS platforms; consequently, application developers do not know how elastic these platforms actually are. This lack of knowledge significantly impairs effective objective-driven design of serverless applications. In this paper, we present an experiment design and a corresponding toolkit for quantifying elasticity and its trade-offs with latency, reliability, and execution costs. We present evaluation results for four popular FaaS platforms by AWS, Google, IBM, and Microsoft, and show significant differences between their service offerings. Based on our results, we assess the applicability of the individual FaaS platforms in three scenarios under different objectives: web serving, online data analysis, and offline batch processing.
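
To make the measurement approach concrete, the following is a minimal sketch of an elasticity experiment loop: it drives an HTTP-triggered function with step-wise increasing concurrency and summarizes latency and error rate per step, so that latency recovery after each load step can hint at how quickly the platform provisions new function instances. The endpoint URL, load profile, and summary metrics are hypothetical illustrations for this sketch, not the paper's actual toolkit.

```python
# Minimal sketch of an elasticity experiment loop against a generic
# HTTP-triggered function. ENDPOINT and the load steps are hypothetical.
import time
import statistics
import concurrent.futures
import urllib.request

ENDPOINT = "https://example.com/function"  # hypothetical function URL

def invoke(_):
    """Invoke the function once; return (latency_seconds, success)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(ENDPOINT, timeout=30) as resp:
            ok = resp.status == 200
    except Exception:
        ok = False
    return time.perf_counter() - start, ok

def run_step(concurrency, duration_s=60):
    """Drive one fixed-concurrency load step; summarize latency/reliability."""
    deadline = time.time() + duration_s
    latencies, failures, total = [], 0, 0
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        while time.time() < deadline:
            # Closed-loop load: issue one batch of concurrent requests,
            # wait for all of them, then issue the next batch.
            for latency, ok in pool.map(invoke, range(concurrency)):
                total += 1
                latencies.append(latency)
                failures += 0 if ok else 1
    return {
        "concurrency": concurrency,
        "p50_s": statistics.median(latencies),
        "p99_s": sorted(latencies)[int(0.99 * (len(latencies) - 1))],
        "error_rate": failures / total,
    }

if __name__ == "__main__":
    # Step-wise ramp; each step roughly doubles the offered parallelism.
    for concurrency in (1, 10, 50, 100):
        print(run_step(concurrency))
```

A full benchmark in the spirit of the paper would additionally meter per-invocation execution cost and repeat each run to control for variance; this sketch only covers the latency and reliability dimensions.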
