Identifying algorithmic vulnerabilities through simulated annealing

Real-time software systems with tight performance requirements are abundant. These systems typically employ many different algorithms, and if any one of them exhibits atypical behavior on a particular input, the entire system may fail to meet its performance requirements. Unfortunately, finding the inputs that trigger worst-case behavior is computationally intractable in general, if not undecidable. Even so, identifying inputs that make a system take, say, ten times longer than it usually does is valuable information for many systems. In this paper, we present a method for finding inputs on which an algorithm performs much worse than it does on average. We use the simulated annealing heuristic search method and show that it successfully finds worst-case inputs to several sorting algorithms under several measures of an algorithm's runtime.
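The general approach described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes comparison count as the runtime measure, a simple first-pivot quicksort as the algorithm under test, random pairwise swaps as the neighborhood move, and a geometric cooling schedule. All function names and parameters are hypothetical.

```python
import math
import random

def quicksort_comparisons(arr):
    """Cost function: count comparisons made by a first-pivot quicksort
    (Lomuto partitioning) on a copy of the input."""
    arr = list(arr)
    count = 0

    def qs(lo, hi):
        nonlocal count
        if lo >= hi:
            return
        pivot = arr[lo]
        i = lo + 1
        for j in range(lo + 1, hi + 1):
            count += 1
            if arr[j] < pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[lo], arr[i - 1] = arr[i - 1], arr[lo]
        qs(lo, i - 2)
        qs(i, hi)

    qs(0, len(arr) - 1)
    return count

def anneal_worst_case(n, steps=20000, t0=5.0, cooling=0.9995, seed=0):
    """Search the space of length-n permutations for an input that
    maximizes the comparison count, via simulated annealing."""
    rng = random.Random(seed)
    current = list(range(n))
    rng.shuffle(current)
    cur_cost = quicksort_comparisons(current)
    best, best_cost = list(current), cur_cost
    t = t0
    for _ in range(steps):
        # Neighbor: swap two random positions of the current input.
        i, j = rng.randrange(n), rng.randrange(n)
        neighbor = list(current)
        neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
        cost = quicksort_comparisons(neighbor)
        # We are maximizing cost, so always accept improvements and
        # accept worsening moves with probability exp(delta / t).
        delta = cost - cur_cost
        if delta >= 0 or rng.random() < math.exp(delta / t):
            current, cur_cost = neighbor, cost
            if cur_cost > best_cost:
                best, best_cost = list(current), cur_cost
        t *= cooling  # geometric cooling schedule
    return best, best_cost
```

For small n the search should drive the cost well above the roughly 2n ln n comparisons a random permutation incurs, toward the n(n-1)/2 worst case; other cost measures (wall-clock time, instruction counts) slot in by replacing the cost function.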
