Communication Metrics for Software Development

We present empirical evidence that metrics on communication artifacts generated by groupware tools can be used to gain significant insight into the development process that produced them. We describe a test-bed for developing and testing communication metrics, a senior-level software engineering project course at Carnegie Mellon University, in which we conducted several studies and experiments from 1991 to 1996 with more than 400 participants. Such a test-bed is an ideal environment for empirical software engineering, providing sufficient realism while allowing for controlled observation of important project parameters. We describe three proof-of-concept experiments to illustrate the value of communication metrics in software development projects. Finally, we propose a statistical framework based on structural equations for validating these communication metrics.

Index Terms—Empirical software engineering, communication, statistics, structural equations.

1 INTRODUCTION

Metrics applied to software artifacts have proven to be useful in measuring the impact of a tool or method in the context of a software project. In some cases, they enabled specific problems in the engineering process to be identified, thus demonstrating the value of metrics and other instrumentation for process improvement. However, software code is only one of the many artifacts produced during software development. Moreover, it is often available only late in the process. The infrastructure and the resources needed to collect metrics on code are also non-negligible and often prevent their use for identifying problems as they occur.

Communication artifacts, such as electronic mail, memoranda, or records generated by groupware tools, represent a different perspective on the development process. They are available throughout the project, they capture information about a more comprehensive set of artifacts (e.g., code, process, organization, politics, morale), and their form is independent of implementation technology, development infrastructure, or even the existence of a product. Metrics applied to such communication artifacts can, therefore, provide significant insight into the development process that produced them.

In this paper, we discuss the design and evaluation of a set of communication metrics for software development. Our goal is to develop metrics that enable the assessment of a tool or a method in the context of a project. Our long-term goal is to provide metrics that help identify problems as they occur. In Section 2, we illustrate possible uses of these metrics with an example.
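As background for the validation framework announced in the abstract, the following sketch shows the conventional form of a linear structural equation system; it is the standard textbook formulation, not necessarily the exact model specified later in this paper. Latent constructs (for instance, communication behavior and project outcome) are related by a structural part, and each construct is tied to its observed indicators (the measured communication metrics and outcome measures) by a measurement part:

\[
\eta = B\eta + \Gamma\xi + \zeta \qquad \text{(structural model)}
\]
\[
y = \Lambda_y \eta + \epsilon, \qquad x = \Lambda_x \xi + \delta \qquad \text{(measurement models)}
\]

Here \(\eta\) denotes the latent endogenous variables (e.g., project outcome), \(\xi\) the latent exogenous variables (e.g., communication behavior), \(y\) and \(x\) their observed indicators, \(B\) and \(\Gamma\) matrices of structural coefficients, \(\Lambda_y\) and \(\Lambda_x\) factor loadings, and \(\zeta\), \(\epsilon\), \(\delta\) error terms. Validating a communication metric then amounts to estimating the loadings and structural coefficients from project data and testing whether the metric is a statistically meaningful indicator of the construct it is intended to measure.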
