Comparison of Seven Bug Report Types: A Case-Study of Google Chrome Browser Project

Bug reports submitted to an issue tracking system can belong to different categories, such as crash, regression, security, cleanup, polish, performance, and usability. A deeper understanding of the properties and features of these categories can have implications for improving software maintenance processes, tools, and practices. We identify several metrics and characteristics that serve as dimensions along which different types of bug reports can be compared. We perform a case study on the Google Chromium Browser open-source project and conduct a series of experiments to compute these metrics. We present a characterization study comparing the bug-report types on metrics such as close-time statistics, number of stars, number of comments, discriminatory and frequent words for each class, entropy across reporters, entropy across components, opening and closing trends, continuity, and debugging-efficiency performance characteristics. The computed metrics show the similarities and differences among the seven types of bug reports along these dimensions.
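
The entropy metrics mentioned above can be understood as Shannon entropy over the distribution of bug reports across reporters (or components) within each category. The following is a minimal sketch in Python illustrating that computation; the (category, reporter) pair representation and the toy data are hypothetical and not the paper's actual dataset or tooling.

```python
import math
from collections import Counter, defaultdict

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a discrete distribution given by raw counts."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def entropy_per_category(reports):
    """Entropy of the reporter distribution for each bug-report category.

    `reports` is an iterable of (category, reporter) pairs -- a hypothetical
    flattening of issue-tracker data used only for illustration.
    """
    by_category = defaultdict(Counter)
    for category, reporter in reports:
        by_category[category][reporter] += 1
    return {cat: shannon_entropy(counter.values())
            for cat, counter in by_category.items()}

# Hypothetical toy data: (category, reporter) pairs.
sample = [("crash", "alice"), ("crash", "bob"), ("crash", "alice"),
          ("security", "carol"), ("security", "carol")]
print(entropy_per_category(sample))
# e.g. {'crash': 0.918..., 'security': 0.0}
```

A higher entropy for a category indicates that its reports come from a more diverse set of reporters (or are spread across more components), while an entropy of zero means all reports in that category share a single reporter or component.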
