AURORA: AUtomatic RObustness coveRage Analysis Tool

Code coverage is commonly used as a measure of testing quality and as an adequacy criterion. Unfortunately, coverage is highly sensitive to modifications of the code structure: the same program, written in syntactically different but semantically equivalent ways, can reach the same degree of coverage with different testing effort. For this reason, code coverage can give the tester misleading information. To understand how a testing criterion is affected by changes in code structure, we have introduced a way to measure the sensitivity of coverage to such changes by means of code-to-code transformations. However, performing this robustness analysis manually is tedious, time consuming, and error prone. To address these issues we present AURORA, a tool that automates the robustness analysis process and leverages the capabilities offered by several existing tools. AURORA has an extensible architecture that concretely supports the tester in carrying out the robustness analysis, so each user can tailor the analysis to their needs. Users can add new transformations written in TXL, a programming language specifically designed for source transformation tasks, while coverage is evaluated with existing code coverage tools on test suites run with the JUnit framework.
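As a concrete illustration of this sensitivity, the sketch below (class and method names are illustrative, not part of AURORA) contains two semantically equivalent implementations of the same predicate together with a single JUnit test. Assuming source-level decision (branch) coverage, the two inputs cover both outcomes of the compound condition but leave one outcome of the nested form uncovered; tools that instrument individual conditions at the bytecode level may report this differently.

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class CoverageSensitivityExample {

    // Compound condition: one decision with two outcomes (true / false).
    static boolean original(boolean a, boolean b) {
        if (a && b) {
            return true;
        }
        return false;
    }

    // Semantically equivalent nested form, as a structure-changing
    // transformation might produce: two decisions, four outcomes.
    static boolean transformed(boolean a, boolean b) {
        if (a) {
            if (b) {
                return true;
            }
        }
        return false;
    }

    // Two inputs cover both outcomes of the single decision in original(),
    // but the same inputs never exercise the inner-false outcome of
    // transformed() (a = true, b = false), so its decision coverage is lower.
    @Test
    public void sameTests_differentCoverage() {
        assertTrue(original(true, true));
        assertFalse(original(false, true));

        assertTrue(transformed(true, true));
        assertFalse(transformed(false, true));
    }
}
```

Rewriting a compound condition into nested conditionals is exactly the kind of code-to-code change that can be expressed as a TXL transformation rule and plugged into AURORA.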
