The Effectiveness of Automated Static Analysis Tools for Fault Detection and Refactoring Prediction

Many automated static analysis (ASA) tools have been developed in recent years to detect software anomalies. These tools aim to help developers eliminate software defects at early stages of development and produce more reliable software at lower cost. Determining the effectiveness of ASA tools requires empirical evaluation. This study evaluates the coding concerns reported by three ASA tools on two open source software (OSS) projects with respect to two types of modifications recorded in the projects' CVS repositories: corrections of faults that caused failures, and refactoring modifications. The results show that fewer than 3% of the detected faults correspond to coding concerns reported by the ASA tools. The tools were more effective at identifying refactoring modifications, corresponding to about 71% of them. More than 96% of the reported coding concerns were false positives that did not relate to any fault or refactoring modification.
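The abstract does not name the three tools or the specific concerns they report; as a minimal hypothetical illustration, the Java snippet below shows the kind of pattern ASA tools for Java (e.g., FindBugs) commonly flag: comparing String objects with ==, which tests reference identity rather than content equality. The class and method names here are invented for the example.

```java
// Hypothetical example of a "coding concern" an ASA tool might report.
public class GreetingCheck {

    // An ASA tool would flag the == comparison below: it compares object
    // references, not string contents. Depending on how the method is
    // called, this is either a latent fault (a failure waiting to happen),
    // a candidate for a refactoring fix, or a false positive if the code
    // path never matters -- mirroring the three outcomes the study
    // distinguishes.
    static boolean isHello(String input) {
        return input == "hello";           // flagged: reference comparison
        // Corrected form: return "hello".equals(input);  // null-safe
    }

    public static void main(String[] args) {
        String typed = new String("hello");        // not interned
        System.out.println(isHello(typed));        // false (different object)
        System.out.println("hello".equals(typed)); // true (same contents)
    }
}
```

Whether such a warning corresponds to a failure-causing fault, a refactoring modification, or neither is exactly the mapping the study performs against the CVS change history.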
