Expectations, outcomes, and challenges of modern code review

Code review is a common software engineering practice employed in both open source and industrial contexts. Review today is less formal and more “lightweight” than the code inspections performed and studied in the 1970s and 1980s. We empirically explore the motivations, challenges, and outcomes of tool-based code reviews. We observed, interviewed, and surveyed developers and managers, and manually classified hundreds of review comments, across diverse teams at Microsoft. Our study reveals that, although finding defects remains the main motivation for review, reviews are less about defects than expected and instead provide additional benefits such as knowledge transfer, increased team awareness, and the creation of alternative solutions to problems. Moreover, we find that understanding the code and the change under review is the key aspect of code reviewing, and that developers employ a wide range of mechanisms to satisfy their understanding needs, needs that current tools largely fail to support. We provide recommendations for practitioners and researchers.
