Automated synthesis of adversarial workloads for network functions

Software network functions (NFs) promise to simplify the deployment of network services and to reduce network operation costs. However, they face the challenge of unpredictable performance. Given this variability, it is imperative that, during deployment, network operators consider the performance of an NF not only under typical workloads but also under adversarial ones. We contribute a tool that helps address this challenge: it takes as input the LLVM code of a network function and outputs packet sequences that trigger slow execution paths. Under the covers, it combines directed symbolic execution with a sophisticated cache model to search for execution paths that incur many CPU cycles and involve adversarial memory-access patterns. We used our tool on 11 network functions that implement a variety of data structures, and we discovered workloads that can in some cases triple latency and cut throughput by 19% relative to typical testing workloads.
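To make the core idea concrete, here is a minimal Python sketch of directed exploration guided by a cache model: partial execution paths are expanded best-first, and each memory access is priced by a simple first-touch cache model so that paths with adversarial access patterns accumulate higher estimated cycle counts. Everything here is an illustrative assumption, not the tool's actual implementation: the `memory_accesses` stand-in, the cost constants, and the branch-and-bound search are invented for this toy setting, whereas the real tool operates on LLVM code via symbolic execution.

```python
import heapq
import itertools

# Illustrative cost parameters (assumed, not measured):
CACHE_LINE = 64     # bytes per cache line on typical x86 hardware
MISS_CYCLES = 200   # assumed penalty for a modeled cache miss
HIT_CYCLES = 4      # assumed cost of a modeled cache hit
DEPTH = 8           # number of symbolic branch decisions to explore

def memory_accesses(branch, step):
    """Toy stand-in for one step of the network function: the
    'adversarial' branch touches a fresh address (likely a miss),
    the other branch re-touches a hot address (likely a hit)."""
    return [step * CACHE_LINE] if branch else [0]

def path_cost(decisions):
    """Estimate cycles for a decision sequence under a first-touch
    cache model: an access misses the first time its line is seen."""
    touched, cycles = set(), 0
    for step, branch in enumerate(decisions):
        for addr in memory_accesses(branch, step):
            line = addr // CACHE_LINE
            cycles += HIT_CYCLES if line in touched else MISS_CYCLES
            touched.add(line)
    return cycles

def adversarial_search():
    """Branch-and-bound search for the costliest decision sequence:
    expand the highest-cost partial path first and prune any path
    whose optimistic bound cannot beat the best complete path."""
    counter = itertools.count()           # heap tie-breaker
    frontier = [(0, next(counter), ())]   # (-est. cycles, tie, decisions)
    best_path, best_cost = (), -1
    while frontier:
        neg_cost, _, decisions = heapq.heappop(frontier)
        cost, remaining = -neg_cost, DEPTH - len(decisions)
        if cost + remaining * MISS_CYCLES <= best_cost:
            continue                      # cannot beat current worst case
        if remaining == 0:
            best_path, best_cost = decisions, cost
            continue
        for branch in (False, True):
            child = decisions + (branch,)
            heapq.heappush(frontier,
                           (-path_cost(child), next(counter), child))
    return best_path, best_cost

if __name__ == "__main__":
    path, cycles = adversarial_search()
    print(f"worst-case branch sequence: {path}")
    print(f"estimated cycles: {cycles}")  # the all-miss path dominates
```

In the paper's setting, the analogue of `path_cost` comes from instruction counts plus a model of the CPU's cache hierarchy, and the search is directed toward expensive paths rather than enumerating all of them; the pruning bound above plays the role of that guidance in this toy.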
