Interactive Production Performance Feedback in the IDE

Because of differences between development and production environments, many software performance problems are detected only after software enters production. We present PerformanceHat, a new system that uses profiling information from production executions to build a global performance model suitable for integration into interactive development environments. As the software is changed during development, PerformanceHat incrementally updates this global model, delivering near real-time predictions of how each change will affect performance in production. We implement PerformanceHat as an Eclipse plugin and evaluate it in a controlled experiment with 20 professional software developers, who complete several software maintenance tasks using our approach and a representative baseline (Kibana). Our results indicate that developers using PerformanceHat were significantly faster at (1) detecting the performance problem and (2) finding its root cause. These results provide encouraging evidence that our approach helps developers detect, prevent, and debug production performance problems during development, before they manifest in production.
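
To make the mechanism concrete, here is a minimal Java sketch of the idea the abstract describes: per-method latencies harvested from production traces are attached to a call graph, and an edit triggers only a local, bottom-up re-estimation of the affected callers. All names here (IncrementalPerformanceModel, onMethodChanged, and so on) are hypothetical illustrations, not PerformanceHat's actual API; the sketch also assumes an acyclic call graph and a simple additive latency model.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

/**
 * Hypothetical sketch, not PerformanceHat's actual implementation.
 * Production traces yield an average latency per method; a reverse call
 * graph lets the model re-estimate only the methods affected by an edit,
 * which is what makes near real-time feedback in the editor feasible.
 * Assumes an acyclic call graph for brevity.
 */
public class IncrementalPerformanceModel {

    // Average observed production latency (ms) of each method's own body.
    private final Map<String, Double> selfLatency = new HashMap<>();
    // Reverse call graph: callee -> its callers, for upward propagation.
    private final Map<String, List<String>> callersOf = new HashMap<>();
    // Forward call graph: caller -> its callees, for summing estimates.
    private final Map<String, List<String>> calleesOf = new HashMap<>();
    // Current prediction, surfaced as an inline annotation in the IDE.
    private final Map<String, Double> predicted = new HashMap<>();

    public void addProfile(String method, double avgMillis) {
        selfLatency.put(method, avgMillis);
    }

    public void addCall(String caller, String callee) {
        calleesOf.computeIfAbsent(caller, k -> new ArrayList<>()).add(callee);
        callersOf.computeIfAbsent(callee, k -> new ArrayList<>()).add(caller);
    }

    /** Called after an edit, e.g. a new call inserted into a method's body. */
    public void onMethodChanged(String method) {
        // Re-estimate the edited method, then propagate upward through the
        // reverse call graph instead of recomputing the whole model.
        Deque<String> work = new ArrayDeque<>();
        work.push(method);
        while (!work.isEmpty()) {
            String m = work.pop();
            double estimate = selfLatency.getOrDefault(m, 0.0);
            for (String callee : calleesOf.getOrDefault(m, List.of())) {
                estimate += predicted.getOrDefault(callee,
                        selfLatency.getOrDefault(callee, 0.0));
            }
            Double old = predicted.put(m, estimate);
            // Only propagate when the estimate actually changed.
            if (old == null || Math.abs(old - estimate) > 1e-9) {
                callersOf.getOrDefault(m, List.of()).forEach(work::push);
            }
        }
    }

    public double predictionFor(String method) {
        return predicted.getOrDefault(method,
                selfLatency.getOrDefault(method, 0.0));
    }

    public static void main(String[] args) {
        IncrementalPerformanceModel model = new IncrementalPerformanceModel();
        model.addProfile("Orders.load", 40.0);   // latencies from production
        model.addProfile("Orders.render", 5.0);
        model.addCall("Orders.render", "Orders.load");
        model.onMethodChanged("Orders.render");  // baseline: 45.0 ms
        // Developer edits render() to add a second call to the slow loader:
        model.addCall("Orders.render", "Orders.load");
        model.onMethodChanged("Orders.render");
        System.out.println(model.predictionFor("Orders.render")); // 85.0 ms
    }
}
```

The incrementality is what the abstract credits for near real-time feedback: only methods reachable upward from the edited method are re-estimated, so predictions can be refreshed on every edit rather than after a full re-analysis or re-profile.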
