Understanding the Role of Reporting in Work Item Tracking Systems for Software Development: An Industrial Case Study

Work item tracking systems such as Visual Studio Team Services, JIRA, BugZilla, and the GitHub issue tracker are widely used by software engineers. Teams use these systems to track work items such as features, user stories, and bugs, as well as to plan sprints, distribute tasks across the team, and prioritize work. Such systems help teams track progress and manage the shipping of software. While these tracking systems present work item data in tabular format, a reporting tool built on top of them can help teams visualize project data, such as how many bugs are open or closed and which work items are assigned to each team member. Although tools like Visual Studio and JIRA provide reporting services, it is important to understand how users leverage them in their projects in order to improve these services. In this study, we conduct an empirical investigation of the usage of Analytics Service, a reporting service provided by Visual Studio Team Services (VSTS) that lets users build dashboards and reports from their work item tracking data. In particular, we want to understand why and how users interact with Analytics Service, and what outcomes and business decisions stakeholders derive from reports built with it. We perform semi-structured interviews and a survey with users of Analytics Service to understand usage patterns and challenges. Our qualitative and quantitative analysis can help organizations and engineers building similar tools or services.
