Agile metrics for a university software engineering course

Teaching agile software development by pairing lectures with hands-on projects has become the norm. However, this approach raises the challenge of grading practical project work and assessing process conformance during development. Few best practices exist for measuring how successfully students implement agile practices; most university courses rely on observations during the course or on final oral exams. In this paper, we propose a set of metrics that give insight into teams' adherence to agile practices. The metrics identify instances in development data, e.g., commits or user stories, where agile processes were not followed. The identified violations can serve as starting points for further investigation and team discussions. With contextual knowledge of a violation, the executed process or the metric itself can be refined. The metrics reflect our experiences from running a software engineering course over the last five years. They measure aspects that students frequently struggle with and that diminish process adoption and student engagement. We present the proposed metrics, which were tested in the latest course iteration, alongside tutoring, lectures, and oral exams.
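As a minimal sketch of the kind of metric described above, the following flags commits whose messages do not reference a user story or ticket ID, a common process violation in student teams. The data layout, function names, and the ID pattern are illustrative assumptions, not the paper's actual implementation.

```python
import re

# Assumed ticket-ID convention, e.g. "PROJ-42"; purely illustrative.
TICKET_PATTERN = re.compile(r"\b[A-Z]+-\d+\b")

def untracked_commits(commits):
    """Return commits whose message lacks a user-story/ticket reference.

    Each commit is a dict with at least 'sha' and 'message' keys
    (a hypothetical shape; real data would come from the VCS).
    """
    return [c for c in commits if not TICKET_PATTERN.search(c["message"])]

commits = [
    {"sha": "a1b2c3", "message": "PROJ-42: add login form validation"},
    {"sha": "d4e5f6", "message": "fix stuff"},
]

# Violations like this would be starting points for a team discussion,
# not automatic grade deductions.
violations = untracked_commits(commits)
```

In this sketch, `violations` contains only the second commit; a tutor could review such flagged commits together with the team to decide whether the process or the metric needs adjustment.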
