Organizing the technical debt landscape

To date, several methods and tools for detecting source code and design anomalies have been developed. While each method focuses on identifying certain classes of source code anomalies that potentially relate to technical debt (TD), the overlaps and gaps among these classes and TD itself have not been rigorously characterized. We propose to construct a seminal technical debt landscape as a way to visualize and organize research on the subject.
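To make the kind of anomaly detection discussed above concrete, here is a minimal sketch of a metrics-based detection rule in the spirit of threshold-driven "detection strategies." Everything in it is illustrative: the thresholds, the function name, and the choice of size metrics are assumptions for exposition, not taken from any specific tool or from this paper.

```python
import ast

# Illustrative thresholds; real detection strategies calibrate these
# against a corpus of systems rather than hard-coding them.
MAX_METHODS = 20
MAX_ATTRIBUTES = 15

def find_god_class_candidates(source: str) -> list[str]:
    """Flag classes whose method/attribute counts exceed simple thresholds.

    A toy stand-in for one class of source code anomaly detector:
    it parses Python source and reports oversized ("God Class") candidates.
    """
    candidates = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            methods = [n for n in node.body
                       if isinstance(n, (ast.FunctionDef, ast.AsyncFunctionDef))]
            attributes = [n for n in node.body
                          if isinstance(n, (ast.Assign, ast.AnnAssign))]
            if len(methods) > MAX_METHODS or len(attributes) > MAX_ATTRIBUTES:
                candidates.append(node.name)
    return candidates
```

Different detectors of this kind flag different, partially overlapping sets of anomalies, which is precisely the overlap-and-gap question the proposed landscape aims to organize.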
