Performance Evolution Matrix: Visualizing Performance Variations Along Software Versions

Source code modifications can significantly affect software performance. Understanding the effect of these changes across different software versions is a challenging yet necessary activity when debugging performance failures, and it is not sufficiently supported by existing profiling tools and visualization approaches: practitioners need to manually compare calling context trees and call graphs. We aim to better support the comparison of benchmark executions across multiple software versions. We propose the Performance Evolution Matrix, an interactive visualization technique that contrasts runtime metrics with source code changes. It combines a comparison of time series data and execution graphs in a matrix layout, showing performance and source code metrics at different levels of granularity. The approach guides practitioners from the high-level identification of a performance regression down to the changes that might have caused it. We conducted a controlled experiment with 12 participants to provide empirical evidence of the viability of our approach. The results indicate that our approach can reduce the effort of identifying the sources of performance regressions compared to traditional profiling visualizations.
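
To illustrate the kind of cross-version comparison the abstract describes, the following is a minimal sketch (not the authors' implementation) of flagging methods whose cumulative execution time grows between consecutive versions of a benchmarked system. It assumes that one profile per version is available as a mapping from method names to cumulative times; the names `Profile`, `detect_regressions`, and `threshold` are hypothetical and chosen only for illustration.

```python
# Hypothetical sketch: flag per-method performance regressions between
# consecutive software versions, the basic comparison a performance
# evolution matrix visualizes. Names and data are illustrative only.

from typing import Dict, List, Tuple

Profile = Dict[str, float]  # method name -> cumulative execution time (ms)

def detect_regressions(
    profiles: List[Profile], threshold: float = 1.5
) -> List[Tuple[int, str, float]]:
    """Return (version index, method, ratio) for methods whose cumulative
    time grew by at least `threshold` times from version i-1 to version i."""
    regressions = []
    for i in range(1, len(profiles)):
        prev, curr = profiles[i - 1], profiles[i]
        for method, time in curr.items():
            base = prev.get(method)
            if base and base > 0 and time / base >= threshold:
                regressions.append((i, method, time / base))
    return regressions

# Example: benchmark profiles for three successive versions.
profiles = [
    {"Parser>>parse:": 120.0, "Lexer>>scan:": 40.0},
    {"Parser>>parse:": 125.0, "Lexer>>scan:": 41.0},
    {"Parser>>parse:": 260.0, "Lexer>>scan:": 42.0},  # regression in version 3
]
print(detect_regressions(profiles))  # [(2, 'Parser>>parse:', 2.08)]
```

In the approach described above, such per-method ratios would feed one cell of the matrix, with rows tracking methods and columns tracking versions, so a practitioner can move from the regression signal to the source code changes in that version.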
