A systematic review on the use of Definition of Done on agile software development projects

Background: Definition of Done (DoD) is a Scrum practice consisting of a simple list of criteria that add verifiable or demonstrable value to the product. It is one of the most popular agile practices and ensures a balance between short-term delivery of features and long-term product quality, yet little is known about its actual use in agile teams.

Objective: To identify possible gaps in the literature and to give practitioners a starting point for defining a DoD, by identifying and synthesizing the DoD criteria used in agile projects as reported in the scientific literature.

Method: We conducted a systematic literature review of studies published up to and including 2016, combining database search with backward and forward snowballing.

Results: In total, we evaluated 2,326 papers, of which 8 reported DoD criteria used in agile projects. Some studies presented up to four levels of DoD: story, sprint, release, and project. We identified 62 done criteria, related to software verification and validation, deployment, code inspection, test process quality, regulatory compliance, software architecture design, process management, configuration management, and non-functional requirements.

Conclusion: The main implication for research is the need for more and better empirical studies documenting and evaluating the use of the DoD in agile software development. For industry, the review provides a map of how the DoD is currently used in practice, which teams can use as a starting point for defining their own DoD or as a benchmark against which to compare it.
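
To make the concept concrete, the sketch below shows one way a multi-level DoD could be represented as a checklist in code. It is a minimal illustration in Python, not part of the review itself: the four levels (story, sprint, release, project) follow the review's findings, while the individual criteria are hypothetical examples aligned with the criterion categories listed above, not criteria extracted from the primary studies.

# Minimal sketch of a multi-level Definition of Done as a checklist.
# Levels mirror those found in the review; the criteria themselves are
# hypothetical examples illustrating its criterion categories.
from dataclasses import dataclass, field

@dataclass
class Criterion:
    description: str
    category: str          # e.g. "verification and validation", "deployment"
    satisfied: bool = False

@dataclass
class DoDLevel:
    name: str              # "story", "sprint", "release", or "project"
    criteria: list[Criterion] = field(default_factory=list)

    def is_done(self) -> bool:
        # A work item at this level is "done" only if every criterion holds.
        return all(c.satisfied for c in self.criteria)

story_dod = DoDLevel("story", [
    Criterion("Unit tests written and passing", "verification and validation"),
    Criterion("Code reviewed by a second developer", "code inspection"),
])

sprint_dod = DoDLevel("sprint", [
    Criterion("Increment deployed to the staging environment", "deployment"),
    Criterion("Response-time targets checked", "non-functional requirements"),
])

if __name__ == "__main__":
    story_dod.criteria[0].satisfied = True
    print(f"Story done: {story_dod.is_done()}")  # False: review still pending

The point of the structure is that "done" is never a single yes/no flag but the conjunction of explicit, demonstrable criteria, scoped to the level (story, sprint, release, or project) at which the team applies them.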
