A case study of program comprehension effort and technical debt estimations

This paper describes a case study that uses developer activity logs as indicators of program comprehension effort by analyzing temporal sequences of developer actions (e.g., navigation and edit actions). We analyze developer activity data spanning 109,065 events and 69 hours of work on a medium-sized industrial application. We examine potential correlations between measures of developer activity, code change metrics, and code smells to gain insight into questions that could guide future estimation of technical debt interest. To gain further insight into the data, we complement the quantitative analysis with commit message analysis and a developer interview. Our results indicate that developer activity, used as an estimate of program comprehension effort, correlates with both change proneness and static metrics for code smells.
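The abstract does not specify how the correlation analysis was performed, so the following is only a minimal sketch of one plausible setup: activity events and change counts are assumed to be aggregated per file, and a rank correlation (Spearman) is used because such counts are typically skewed. The file names and numbers are hypothetical placeholders, not data from the study.

```python
# Illustrative sketch only; not the paper's actual analysis script.
# Assumed inputs: per-file counts of navigation/edit events (a proxy for
# comprehension effort) and per-file change counts (change proneness).
from scipy.stats import spearmanr

# Hypothetical per-file measurements: file -> (activity events, commits touching file)
measurements = {
    "OrderService.java":  (1420, 37),
    "InvoiceMapper.java": (310,   9),
    "Utils.java":         (2250, 52),
    "Config.java":        (95,    4),
}

activity = [a for a, _ in measurements.values()]
changes  = [c for _, c in measurements.values()]

# Rank correlation tolerates the skewed, non-normal distributions typical of
# both activity counts and change counts.
rho, p_value = spearmanr(activity, changes)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```

The same aggregation could be repeated against static code smell metrics instead of change counts; the choice of file-level granularity here is an assumption for illustration.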
