Convergent contemporary software peer review practices

Software peer review is practiced on a diverse set of software projects that have drastically different settings, cultures, incentive systems, and time pressures. In an effort to characterize and understand these differences, we examine two Google-led projects, Android and Chromium OS; three Microsoft projects, Bing, Office, and MS SQL; and projects internal to AMD. We contrast our findings with data from traditional software inspection conducted on a Lucent project and from open source software peer review on six projects, including Apache, Linux, and KDE. Our measures of interest include the review interval, the number of developers involved in review, and proxy measures for the number of defects found during review. We find that, despite differences among projects, many characteristics of the review process have independently converged to similar values, which we believe indicate general principles of code review practice. We also introduce a measure of the degree to which knowledge is shared during review, an aspect of review practice that has traditionally had only experiential support. Our knowledge-sharing measure shows that conducting peer review increases the number of distinct files a developer knows about by 66% to 150%, depending on the project. This paper is one of the first studies of contemporary review in software firms and the most diverse study of peer review to date.
