Hierarchical software landscape visualization for system comprehension: A controlled experiment

In many enterprises, the number of deployed applications is constantly increasing. These applications, often numbering several hundred, form large software landscapes. Comprehending such landscapes is frequently impeded by, for instance, architectural erosion, personnel turnover, or changing requirements. Therefore, an efficient and effective way to comprehend such software landscapes is required. The current state of the art typically visualizes software landscapes via flat, graph-based representations of nodes, applications, and their communication. In our ExplorViz visualization, we introduce hierarchical abstractions that aim to solve typical system comprehension tasks quickly and accurately for large software landscapes. To evaluate our hierarchical approach, we conducted a controlled experiment comparing our hierarchical landscape visualization to a flat, state-of-the-art visualization. In addition, we thoroughly analyzed the strategies employed by the participants and provide a package containing all our experimental data to facilitate the verifiability, reproducibility, and further extensibility of our results. We observed a statistically significant increase of 14% in task correctness for the hierarchical visualization group compared to the flat visualization group. The time spent on the system comprehension tasks showed no significant difference. These results back up our claim that our hierarchical concept enhances the current state of the art in landscape visualization.
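To illustrate the kind of between-groups analysis such an experiment involves, the sketch below compares task-correctness scores of two independent groups with SciPy. It is a minimal, hypothetical example: the scores, group sizes, and the exact test pipeline (Shapiro-Wilk normality check, Levene's test for equal variances, then an independent t-test or a Mann-Whitney U test) are assumptions for illustration, not the authors' actual data or analysis procedure.

    # Illustrative sketch (hypothetical data): comparing task correctness
    # between two independent experiment groups.
    from scipy import stats

    # Hypothetical correctness scores (fraction of points achieved) per participant.
    hierarchical = [0.85, 0.90, 0.78, 0.92, 0.88, 0.81, 0.87, 0.93]
    flat = [0.70, 0.75, 0.68, 0.80, 0.72, 0.77, 0.74, 0.69]

    # Shapiro-Wilk normality check for each sample.
    normal = all(stats.shapiro(g).pvalue > 0.05 for g in (hierarchical, flat))

    if normal:
        # Levene's test decides whether to assume equal variances in the t-test.
        equal_var = stats.levene(hierarchical, flat).pvalue > 0.05
        result = stats.ttest_ind(hierarchical, flat, equal_var=equal_var)
    else:
        # Non-parametric fallback when normality is rejected.
        result = stats.mannwhitneyu(hierarchical, flat, alternative="two-sided")

    print(f"p-value: {result.pvalue:.4f}")

Falling back to a non-parametric test when normality is rejected is a common design choice here, since the small sample sizes typical of controlled comprehension experiments rarely justify strong distributional assumptions.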
