Developer targeted analytics: Supporting software development decisions with runtime information

Runtime information from deployed software has long been used by business and operations units to make informed decisions, under the umbrella term "analytics". Decisions made by software engineers in the course of evolving software, however, have for the most part been based on personal belief and gut feeling. This can be attributed to software development having long been viewed as an activity detached from operating software in a production environment. In recent years, this view has been challenged by the emergence of the DevOps movement, which aims to promote cross-functional development and operations capabilities within teams. This shift in process and mindset requires analytics tools that specifically target software developers. In this research, I investigate approaches that support developers in their decision-making by incorporating runtime information into source code and providing live feedback in IDEs that predicts the impact of code changes.
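To make the envisioned developer feedback concrete, the following is a minimal sketch of the idea: production metrics attached to individual methods drive a simple impact prediction that an IDE plugin could surface at the edit site. All names and numbers (RuntimeFeedback, OrderService.checkout, the latency figures) are hypothetical illustrations, not the actual thesis tooling, and the prediction model is deliberately naive.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal sketch: attach production latency measurements to source methods
 * and estimate the cost of an edit that adds a call, the kind of live
 * IDE feedback the abstract describes. Hypothetical names and numbers.
 */
public class RuntimeFeedback {

    /** Aggregated production metrics for one method. */
    static final class MethodMetrics {
        final double avgLatencyMs;   // mean latency observed in production
        final long callsPerMinute;   // observed call frequency
        MethodMetrics(double avgLatencyMs, long callsPerMinute) {
            this.avgLatencyMs = avgLatencyMs;
            this.callsPerMinute = callsPerMinute;
        }
    }

    // Metrics keyed by method name, as a monitoring backend might
    // report them after aggregation (values below are made up).
    private final Map<String, MethodMetrics> metrics = new HashMap<>();

    void record(String method, double avgLatencyMs, long callsPerMinute) {
        metrics.put(method, new MethodMetrics(avgLatencyMs, callsPerMinute));
    }

    /**
     * Naive impact prediction: if an edit adds a call to {@code callee}
     * inside {@code editedMethod}, the callee's average latency is paid
     * on every production invocation of the edited method.
     */
    double predictedAddedLatencyPerMinute(String editedMethod, String callee) {
        MethodMetrics caller = metrics.get(editedMethod);
        MethodMetrics called = metrics.get(callee);
        if (caller == null || called == null) return 0.0; // no runtime data yet
        return caller.callsPerMinute * called.avgLatencyMs;
    }

    public static void main(String[] args) {
        RuntimeFeedback feedback = new RuntimeFeedback();
        feedback.record("OrderService.checkout", 12.0, 600);
        feedback.record("PricingClient.fetchQuote", 85.0, 40);

        double addedMsPerMinute = feedback.predictedAddedLatencyPerMinute(
                "OrderService.checkout", "PricingClient.fetchQuote");
        // An IDE plugin could render this as an inline warning on the edited line.
        System.out.printf("Predicted added latency: %.0f ms/min%n", addedMsPerMinute);
    }
}
```

In a real setting the point estimates above would be distributions with uncertainty bounds, and the feedback would appear inline in the editor rather than on the console; the sketch only illustrates the feedback loop from runtime data to a change-impact signal.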
