Assisting software engineering students in analyzing their performance in software development

Collecting product and process measures in software development projects, particularly in education and training environments, is important as a basis for assessing current performance and identifying opportunities for improvement. However, analyzing the collected data manually is challenging because of the expertise required, the lack of benchmarks for comparison, the amount of data to analyze, and the time required to do the analysis. ProcessPAIR is a novel tool for automated performance analysis and improvement recommendation; based on a performance model calibrated from the performance data of many developers, it automatically identifies and ranks the potential performance problems and root causes of individual developers. In education and training environments, it increases students’ autonomy and reduces instructors’ effort in grading and feedback. In this article, we present the results of a controlled experiment involving 61 software engineering master students, half of whom used ProcessPAIR in a Personal Software Process (PSP) performance analysis assignment, while the other half used a traditional PSP support tool (Process Dashboard) for the same assignment. The results show significant benefits in terms of students’ satisfaction (average score of 4.78 on a 1–5 scale for ProcessPAIR users, against 3.81 for Process Dashboard users), quality of the analysis outcomes (average grade of 88.1 on a 0–100 scale for ProcessPAIR users, against 82.5 for Process Dashboard users), and time required to do the analysis (average of 252 min for ProcessPAIR users, against 262 min for Process Dashboard users, a modest gain but with much room for improvement).
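To make the analysis approach concrete, the following is a minimal sketch, not ProcessPAIR's actual implementation, of how benchmark-based thresholds can be calibrated from the performance data of many developers and then used to flag and rank an individual developer's potential performance problems. The metric names, percentile cutoffs, and severity formula are illustrative assumptions, not the tool's own.

```python
# Hypothetical sketch of benchmark-based performance analysis, in the
# spirit of ProcessPAIR: calibrate thresholds from many developers'
# data, then flag and rank one developer's metrics against them.
# Metric names and percentile cutoffs are illustrative assumptions.

def percentile(values, p):
    """Linear-interpolated percentile of a list of numbers (0 <= p <= 100)."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def calibrate(benchmark):
    """For each metric (here, higher = worse), derive 'warning' and
    'problem' thresholds from the 66th and 90th benchmark percentiles."""
    return {m: (percentile(vals, 66), percentile(vals, 90))
            for m, vals in benchmark.items()}

def analyze(developer, thresholds):
    """Rank a developer's metrics by how far they exceed the thresholds."""
    findings = []
    for metric, value in developer.items():
        warn, prob = thresholds[metric]
        if value > warn:
            severity = (value - warn) / (prob - warn)  # 1.0 == 'problem' level
            findings.append((metric, value, round(severity, 2)))
    return sorted(findings, key=lambda f: f[2], reverse=True)

# Benchmark: one list of observed values per metric, across many developers.
benchmark = {
    "defect_density":  [5, 8, 10, 12, 15, 18, 22, 30, 35, 50],  # defects/KLOC
    "test_time_ratio": [0.1, 0.15, 0.2, 0.2, 0.25, 0.3, 0.35, 0.4, 0.5, 0.6],
}
thresholds = calibrate(benchmark)

# One developer's measures; metrics above the calibrated thresholds are
# reported as potential performance problems, most severe first.
me = {"defect_density": 33, "test_time_ratio": 0.22}
for metric, value, severity in analyze(me, thresholds):
    print(f"potential problem: {metric}={value} (severity {severity})")
```

Under these assumed cutoffs, the sketch flags `defect_density` (33 exceeds the 66th-percentile threshold of roughly 21.8) but not `test_time_ratio`, mirroring how a benchmark-calibrated model can rank problems without requiring the student to know the benchmarks themselves.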
