Are code smell detection tools suitable for detecting architecture degradation?

Context: Several studies suggest a relation between code smells and architecture degradation. They claim that classes that have degraded architecturally can be detected on the basis of code smells, at least when the smells are identified manually in the source code.

Objective: To evaluate the suitability of contemporary code smell detection tools, combining different smell categories, for finding classes that show symptoms of architecture degradation.

Method: A case study is performed in which architectural inconsistencies in an open source system are detected via reflexion modeling and code smell metrics are collected with several tools. Using data mining techniques, we investigate whether classes connected to architectural inconsistencies can be classified automatically and accurately on the basis of the gathered code smell data.

Results: The results suggest that existing code smell detection techniques, as implemented in contemporary tools, are not sufficiently accurate for classifying whether a class contains architectural inconsistencies, even when categories of code smells are combined.

Conclusion: Current automated code smell detection techniques appear to require fine-tuning for a specific system if they are to be used for finding classes with architectural inconsistencies. More research on the causes of architecture violations is needed to build more accurate detection techniques that work out of the box.
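The method above combines two ingredients: automated, metrics-based smell detection rules applied per class, and a ground truth of classes tied to architectural inconsistencies obtained through reflexion modeling. A minimal sketch of that pairing is shown below, using a Marinescu-style "detection strategy" for God Class candidates; all class names, metric values, and thresholds here are invented for illustration, not taken from the study.

```python
# Hypothetical sketch: flag God Class candidates with a metrics-based rule,
# then measure agreement with classes known (e.g. via reflexion modeling)
# to participate in architectural inconsistencies. Data is made up.

# Per-class metrics: WMC  = weighted methods per class (complexity),
#                    ATFD = accesses to foreign data,
#                    TCC  = tight class cohesion (0..1, higher is more cohesive)
metrics = {
    "OrderManager": {"WMC": 52, "ATFD": 9, "TCC": 0.12},
    "Invoice":      {"WMC": 11, "ATFD": 1, "TCC": 0.70},
    "ReportFacade": {"WMC": 48, "ATFD": 7, "TCC": 0.20},
    "Money":        {"WMC": 6,  "ATFD": 0, "TCC": 0.85},
}

def is_god_class(m, wmc_high=47, atfd_few=5, tcc_low=0.33):
    """God Class rule: highly complex, uses much foreign data, low cohesion."""
    return m["WMC"] >= wmc_high and m["ATFD"] > atfd_few and m["TCC"] < tcc_low

flagged = {name for name, m in metrics.items() if is_god_class(m)}

# Ground truth: classes connected to architectural inconsistencies
inconsistent = {"OrderManager", "Invoice"}

precision = len(flagged & inconsistent) / len(flagged)
recall = len(flagged & inconsistent) / len(inconsistent)
print(sorted(flagged), f"precision={precision:.2f}", f"recall={recall:.2f}")
```

Even in this toy setup the rule both misses an inconsistent class and flags a consistent one, which mirrors the paper's finding that fixed, out-of-the-box thresholds do not transfer reliably to the architectural-inconsistency classification task.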
