A Family of Experiments to Investigate the Influence of Context on the Effect of Inspection Techniques

Abstract

For a growing population of researchers in software engineering, empirical studies have become a key research approach. Empirical studies may be used, for example, to evaluate technologies and to direct further research by revealing the problems and difficulties people face in practice. Without empirical studies, we have to rely on intuition or educated opinion alone. Individual empirical studies often yield interesting results for their particular context, but typically this context is not described in sufficient detail to decide whether another context is similar enough for the study's conclusions to apply there as well. We argue that families of experiments with a common framework for collecting context data are necessary in order to abstract conclusions at a useful level of detail.

This paper describes a method to plan, conduct, and analyze coordinated, or concerted, families of experiments. The goal of the method is to maximize the quality and benefit of the individual empirical studies as parts of the family and to minimize the effort for researchers by reusing experiment know-how. This is achieved by providing a common framework for context measurement, study preparation, material, and analysis for all studies of the experiment family.

We apply the method to describe the planning steps for an experiment family on the influence of context on the effectiveness of defect reduction techniques. We focus on a particular technology, reading techniques for inspections, to instantiate this work. The first step of this experiment family is a broad survey of software companies on the state of the practice in inspection processes and inspection techniques. The second step is to benchmark state-of-the-art inspection techniques against each participating organization's own documents and inspection techniques.

Keywords: Empirical Software Engineering, Experiment Family, Software Inspection, Meta-Analysis.
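To make the analysis step across a family concrete, the following minimal Python sketch shows one common way to combine results from the individual experiments: a fixed-effect meta-analysis of standardized mean differences (Hedges' g with small-sample correction), in the style of Hedges & Olkin. The experiment names and all numbers are illustrative placeholders, not data from the studies discussed in this paper.

import math

# Each tuple is one experiment of the family:
# (name, n_treatment, mean_treatment, sd_treatment, n_control, mean_control, sd_control),
# e.g. mean defect detection rates for a reading technique vs. a control technique.
# All values below are illustrative placeholders.
experiments = [
    ("Exp A", 12, 0.58, 0.14, 12, 0.45, 0.16),
    ("Exp B", 20, 0.52, 0.12, 18, 0.47, 0.13),
    ("Exp C", 15, 0.61, 0.18, 15, 0.49, 0.17),
]

def hedges_g(n1, m1, s1, n2, m2, s2):
    """Small-sample-corrected standardized mean difference and its variance."""
    df = n1 + n2 - 2
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / df)
    d = (m1 - m2) / pooled_sd
    g = (1 - 3 / (4 * df - 1)) * d          # Hedges' correction factor J
    var = (n1 + n2) / (n1 * n2) + g ** 2 / (2 * (n1 + n2))
    return g, var

weighted_sum = 0.0
total_weight = 0.0
for name, n1, m1, s1, n2, m2, s2 in experiments:
    g, var = hedges_g(n1, m1, s1, n2, m2, s2)
    weight = 1.0 / var                      # inverse-variance weighting
    weighted_sum += weight * g
    total_weight += weight
    print(f"{name}: g = {g:.3f}, var = {var:.4f}")

pooled = weighted_sum / total_weight
se = math.sqrt(1.0 / total_weight)
print(f"pooled g = {pooled:.3f}, 95% CI [{pooled - 1.96 * se:.3f}, {pooled + 1.96 * se:.3f}]")

Note that a fixed-effect model assumes all experiments estimate a single common effect; when context varies across the participating organizations, as this paper argues it typically does, a random-effects model that additionally estimates between-study variance would be the more cautious choice.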
