Understanding Performance Interference Benchmarking and Application Profiling Techniques for Cloud-hosted Latency-Sensitive Applications

Modern data centers are composed of heterogeneous servers that differ in architecture, processor count, number of cores, and clock speed. They also vary in memory speed and capacity, storage type and capacity, and network connectivity. In addition, these servers are multi-tenant, often hosting latency-sensitive applications alongside traditional batch-processing applications. To provide bounded and predictable latencies, cloud providers must understand the performance interplay among co-hosted applications. To that end, we present our integrated and extensible framework, called INDICES, with which users can conduct a variety of performance benchmarking experiments on multi-tenant servers. The framework also performs centralized collection of a range of resource usage and application performance statistics in order to model performance interference and estimate execution times for cloud-hosted applications.
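
To illustrate the kind of workflow such a framework supports, the sketch below samples host-level resource counters while a co-located workload runs, then fits a simple regression model that estimates an application's execution time from those interference features. This is a minimal, hypothetical example, not the INDICES implementation: the metric set, sampling scheme, helper names (sample_host_metrics, profile_run), and the linear model are all assumptions made for illustration.

```python
# Minimal sketch (illustrative only): sample host resource usage during a run,
# then fit a regression model mapping interference features to execution time.
# The feature set and model choice are assumptions, not the INDICES design.
import time
import psutil                              # host-level CPU/memory/IO counters
from sklearn.linear_model import LinearRegression


def sample_host_metrics(duration_s=5.0, interval_s=0.5):
    """Average CPU, memory, and disk-I/O activity over a short window."""
    cpu, mem, io = [], [], []
    last_io = psutil.disk_io_counters()
    for _ in range(int(duration_s / interval_s)):
        cpu.append(psutil.cpu_percent(interval=interval_s))
        mem.append(psutil.virtual_memory().percent)
        cur_io = psutil.disk_io_counters()
        io.append((cur_io.read_bytes + cur_io.write_bytes)
                  - (last_io.read_bytes + last_io.write_bytes))
        last_io = cur_io
    n = len(cpu)
    return [sum(cpu) / n, sum(mem) / n, sum(io) / n]


def profile_run(app_fn):
    """Run the target application once and record (features, latency)."""
    features = sample_host_metrics()       # background interference level
    start = time.perf_counter()
    app_fn()                               # latency-sensitive workload
    elapsed = time.perf_counter() - start
    return features, elapsed


def fit_interference_model(samples):
    """Fit execution time as a linear function of resource-usage features."""
    X = [features for features, _ in samples]
    y = [elapsed for _, elapsed in samples]
    return LinearRegression().fit(X, y)


if __name__ == "__main__":
    # Toy workload standing in for a cloud-hosted, latency-sensitive task.
    workload = lambda: sum(i * i for i in range(2_000_000))
    samples = [profile_run(workload) for _ in range(10)]
    model = fit_interference_model(samples)
    print("estimated execution time:",
          model.predict([sample_host_metrics()])[0])
```

In practice, a centralized collector would aggregate such samples from many co-located benchmark runs (CPU-, memory-, and I/O-intensive) before training any interference model; the single-host loop above is only meant to show the data being gathered and how it could feed an execution-time estimate.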
