Evaluating distributed inspection through controlled experiments

Inspection methods can be classified according to their discipline and flexibility. Discipline concerns the formality of an inspection method, whereas flexibility relates to how easily a meeting can be organised and conducted. Most of the available distributed inspection methods exhibit a high level of both discipline and flexibility: they are based on a well-defined process, and the discussion among team members is easily organised and conducted. In this study the authors present two controlled experiments that evaluate the effectiveness and efficiency of a distributed inspection process for discovering defects in source code. The first experiment compares the proposed distributed inspection method with a disciplined but not flexible method (i.e. Fagan's inspection process). The second experiment investigates differences between the same distributed inspection method and a flexible but not disciplined method (i.e. the pair inspection method). Data analysis reveals that more flexible methods require less time to inspect a software artefact, while the level of discipline does not affect inspection quality.