Comparison of different impact analysis methods and programmer's opinion: an empirical study

In change impact analysis, guidance from automatic tools would be highly desirable, since this activity is generally seen as a very difficult program comprehension problem. However, because the notion of the 'impact set' (or dependency set) of a specific change is usually inexact and context dependent, the approaches and algorithms for computing these sets are equally diverse and produce quite different results. The question of which algorithm finds program dependencies most effectively has preoccupied researchers for a long time, yet few published results compare the output of different algorithms with what programmers consider real dependencies. In this work, we report on an experiment conducted with this goal in mind, using a compact, easily comprehensible Java experimental system, simulated program changes, and a group of programmers who were asked to perform impact analysis both with the help of different tools and on the basis of their programming experience. We show which algorithms came closest to the programmers' opinion in this case study. However, the results also confirmed that most existing algorithms need further enhancement, and an effective methodology for using automated tools to support impact analysis has yet to be found.
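
To make the notion of an impact set concrete, the following is a minimal sketch of one classic family of techniques compared in studies like this: a transitive closure over a static call graph, where a change to a method is propagated to everything that (transitively) calls it. All class and method names below are hypothetical illustrations, not the experimental system or the specific algorithms used in the study.

import java.util.*;

/**
 * Minimal impact-set sketch: breadth-first closure over reverse
 * call-graph edges. A change to one method is assumed to possibly
 * impact every method that transitively calls it.
 */
public class CallGraphImpact {

    // Reverse call graph: maps each method to the set of its callers,
    // so that impact propagates from a changed callee to its callers.
    private final Map<String, Set<String>> callers = new HashMap<>();

    public void addCall(String caller, String callee) {
        callers.computeIfAbsent(callee, k -> new HashSet<>()).add(caller);
    }

    /** Everything that transitively calls the changed method. */
    public Set<String> impactSet(String changedMethod) {
        Set<String> impacted = new LinkedHashSet<>();
        Deque<String> work = new ArrayDeque<>();
        work.add(changedMethod);
        while (!work.isEmpty()) {
            String m = work.remove();
            for (String caller : callers.getOrDefault(m, Set.of())) {
                if (impacted.add(caller)) {
                    work.add(caller);
                }
            }
        }
        return impacted;
    }

    public static void main(String[] args) {
        CallGraphImpact g = new CallGraphImpact();
        g.addCall("Main.run", "Service.process");
        g.addCall("Service.process", "Parser.parse");
        g.addCall("Report.print", "Parser.parse");
        // A change to Parser.parse impacts Service.process,
        // Report.print and Main.run.
        System.out.println(g.impactSet("Parser.parse"));
    }
}

Real impact analysis algorithms differ mainly in how such dependency edges are computed and filtered (statically or dynamically, at coarser or finer granularity), which is precisely why their resulting impact sets diverge and why comparing them against programmer opinion is informative.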
