Automated test case generation to validate non-functional software requirements

A software system is bounded by a set of requirements. Functional requirements describe what the system must do, in terms of inputs, behavior, and outputs. Non-functional requirements describe how well those functional requirements are satisfied, in terms of qualities of, or constraints on, the design or implementation of the system. Both kinds of requirements are integral parts of the software design specification and should be revisited throughout all phases of software development. In practice, however, tool and technique support for validating the two kinds of requirements does not receive equal emphasis. Techniques for validating functional requirements target all levels of software testing and exploit both black-box and white-box approaches. Techniques for validating non-functional requirements, in contrast, largely operate in a black-box manner and focus only on the system testing level. As a result, most software companies invest more effort in validating functional requirements and assess non-functional requirements only after functional validation is complete.

We propose a set of exhaustive white-box testing techniques that enable cost-effective validation of non-functional requirements from two perspectives. For non-functional requirements defined as qualities of a system, we target load testing for the purpose of performance validation. We present a load test suite generation approach that uses symbolic execution to exhaustively traverse program execution paths and produce test cases for the paths that expose worst-case resource consumption scenarios. An assessment of the approach on a set of Java applications shows that it generates test suites inducing program response times and memory consumption several times worse than those of the compared alternatives, that it scales to large and complex inputs, and that it exposes a diversity of resource-consuming program behaviors. For non-functional requirements defined as constraints on a system, we present an approach for validating contextual constraints imposed by the external resources with which the software interacts. The approach amplifies existing tests in an exhaustive manner to validate the exception handling constructs used to handle such constraints. Our assessment of the approach on a set of Android mobile applications indicates that it can be fully automated, is powerful enough to detect 65% of the faults of this kind reported in bug reports, and is precise enough that 77% of the detected anomalies correspond to faults fixed by the developers.

Combined, the two proposed techniques complement the field of automated software testing by providing exhaustive support for non-functional validation. In this proposal we discuss the completed work as well as the work that remains. Once completed, this research will benefit both researchers and practitioners. For researchers, our work paves the way for developing precise white-box techniques for validating non-functional requirements. For practitioners, our approach supports the generation of non-functional test cases at all levels of software testing, especially at the unit testing level, where current tool support for non-functional validation is scarce.
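To make the load-testing idea concrete, the following is a minimal, self-contained sketch in Java. It substitutes bounded exhaustive enumeration of candidate inputs for the symbolic path exploration used in the proposal, and it uses an instrumented insertion sort as a hypothetical subject whose comparison count stands in for resource consumption; all names and bounds below are illustrative assumptions, not artifacts of the proposed tool.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

/**
 * Minimal sketch of worst-case load test selection. Bounded exhaustive
 * enumeration of inputs stands in for symbolic path exploration; the
 * instrumented insertion-sort subject is hypothetical.
 */
public class WorstCaseSearchSketch {

    /** Hypothetical subject: insertion sort instrumented to count comparisons. */
    static long insertionSortCost(int[] input) {
        int[] a = input.clone();
        long comparisons = 0;
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0) {
                comparisons++;
                if (a[j] <= key) break;
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 6;                               // input size bound
        int[] base = new int[n];
        for (int i = 0; i < n; i++) base[i] = i;

        long worstCost = -1;
        int[] worstInput = null;
        for (int[] candidate : permutations(base)) {
            long cost = insertionSortCost(candidate);
            if (cost > worstCost) {              // keep the costliest input seen so far
                worstCost = cost;
                worstInput = candidate;
            }
        }
        System.out.println("Worst-case test input: " + Arrays.toString(worstInput)
                + " (comparisons = " + worstCost + ")");
    }

    /** All permutations of the given array, via simple backtracking. */
    static List<int[]> permutations(int[] a) {
        List<int[]> out = new ArrayList<>();
        permute(a.clone(), 0, out);
        return out;
    }

    private static void permute(int[] a, int start, List<int[]> out) {
        if (start == a.length) {
            out.add(a.clone());
            return;
        }
        for (int i = start; i < a.length; i++) {
            swap(a, start, i);
            permute(a, start + 1, out);
            swap(a, start, i);                   // backtrack
        }
    }

    private static void swap(int[] a, int i, int j) {
        int tmp = a[i];
        a[i] = a[j];
        a[j] = tmp;
    }
}
```

Running the sketch reports the reverse-sorted permutation as the costliest input, which is exactly the kind of worst-case-exposing test case the proposed approach aims to generate automatically, but on much larger and more complex inputs than brute-force enumeration could reach.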

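The second technique, exhaustive test amplification for exception handling, can be sketched in the same spirit. The wrapper below injects an IOException at every possible read from an external resource and checks that the code under test always falls back cleanly; the ConfigLoader-style subject, its fallback value, and all class names are hypothetical and not taken from the proposal's tooling.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

/**
 * Minimal sketch of exhaustively amplifying an existing test against
 * external-resource failures: a failure is injected at every possible
 * read position, and each amplified run must be handled gracefully.
 */
public class ExceptionAmplificationSketch {

    /** Hypothetical code under test: reads a config value, falls back on failure. */
    static String loadConfig(InputStream in) {
        StringBuilder sb = new StringBuilder();
        try {
            int b;
            while ((b = in.read()) != -1) {
                sb.append((char) b);
            }
            return sb.toString();
        } catch (IOException e) {
            return "default";                    // exception handling construct under validation
        } finally {
            try { in.close(); } catch (IOException ignored) { }
        }
    }

    /** Stream wrapper that fails on the n-th read, simulating a flaky resource. */
    static class FailingStream extends InputStream {
        private final InputStream delegate;
        private int readsUntilFailure;

        FailingStream(InputStream delegate, int readsUntilFailure) {
            this.delegate = delegate;
            this.readsUntilFailure = readsUntilFailure;
        }

        @Override
        public int read() throws IOException {
            if (readsUntilFailure-- == 0) {
                throw new IOException("injected failure");
            }
            return delegate.read();
        }
    }

    public static void main(String[] args) {
        byte[] data = "timeout=30".getBytes();

        // Original test: the happy path.
        assertEquals("timeout=30", loadConfig(new ByteArrayInputStream(data)));

        // Amplified tests: inject a failure at every possible read position.
        for (int failAt = 0; failAt <= data.length; failAt++) {
            String result = loadConfig(
                    new FailingStream(new ByteArrayInputStream(data), failAt));
            assertEquals("default", result);
        }
        System.out.println("All amplified runs handled the injected failures.");
    }

    private static void assertEquals(String expected, String actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError("expected " + expected + " but got " + actual);
        }
    }
}
```

The exhaustive loop over failure positions mirrors, in miniature, how the proposed approach systematically exposes every exception handling path that an existing test only exercises implicitly.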