Exploring the suitability of source code metrics for indicating architectural inconsistencies

Software architecture degradation is a phenomenon that frequently occurs during software evolution. Source code anomalies are one of several aspects that potentially contribute to software architecture degradation. Many techniques for automating the detection of such anomalies are based on source code metrics. It is, however, unclear how accurate these techniques are in identifying the architecturally relevant anomalies in a system. The objective of this paper is to shed light on the extent to which source code metrics on their own can be used to characterize classes contributing to software architecture degradation. We performed a multi-case study of three open-source systems, gathering for each the intended architecture as well as data for 49 different source code metrics taken from seven code quality tools. This data was analyzed to explore the links between architectural inconsistencies, as detected by applying reflexion modeling, and metric values indicating potential design problems at the implementation level. The results show that there does not seem to be a direct correlation between metrics and architectural inconsistencies. For many metrics, however, classes flagged as more problematic by their metric value seem significantly more likely to contribute to inconsistencies than less problematic classes. In particular, fan-in, the size of a class's public API, and method counts seem to be suitable indicators. Fan-in is particularly interesting, as class size does not seem to have a confounding effect on this metric. This finding may be useful for focusing code restructuring efforts with the help of architecturally relevant metrics when the intended architecture is not explicitly specified, and for further improving architecture recovery and consistency checking tool support.
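The kind of association described above can be explored with a simple contingency-table analysis: split classes into "problematic" and "unproblematic" according to a metric threshold, record whether each class contributes to at least one inconsistency found by reflexion modeling, and test the resulting 2x2 table. The sketch below is only an illustration of that idea, not the study's actual analysis pipeline; the class data, the fan-in threshold, and the one-sided test setup are assumptions made for the example (SciPy's fisher_exact is used for the test).

    # Minimal sketch (assumed data and threshold): do classes with high fan-in
    # contribute to architectural inconsistencies more often than others?
    from scipy.stats import fisher_exact

    # Hypothetical per-class data: fan-in value and whether the class contributes
    # to at least one inconsistency reported by reflexion modeling.
    classes = [
        {"name": "OrderService",   "fan_in": 34, "inconsistent": True},
        {"name": "OrderValidator", "fan_in": 3,  "inconsistent": False},
        {"name": "DbGateway",      "fan_in": 21, "inconsistent": True},
        {"name": "StringUtils",    "fan_in": 40, "inconsistent": False},
        {"name": "LegacyAdapter",  "fan_in": 5,  "inconsistent": True},
        {"name": "ReportExporter", "fan_in": 2,  "inconsistent": False},
    ]

    FAN_IN_THRESHOLD = 20  # assumed cut-off separating "problematic" from "unproblematic"

    def contingency_table(classes, threshold):
        """Build the 2x2 table [[prob_incons, prob_ok], [unprob_incons, unprob_ok]]."""
        table = [[0, 0], [0, 0]]
        for c in classes:
            row = 0 if c["fan_in"] > threshold else 1  # problematic vs. unproblematic
            col = 0 if c["inconsistent"] else 1        # contributes vs. does not
            table[row][col] += 1
        return table

    table = contingency_table(classes, FAN_IN_THRESHOLD)
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    print(f"table={table}, odds ratio={odds_ratio:.2f}, p={p_value:.3f}")

A greater-than-one odds ratio with a small p-value in such a test would correspond to the observation reported above, namely that metric-flagged classes are more likely to contribute to inconsistencies, without implying a direct correlation between raw metric values and inconsistency counts.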
