Evaluating the effect of a delegated versus centralized control style on the maintainability of object-oriented software

A fundamental question in object-oriented design is how to make software maintainable. According to expert opinion, a delegated control style, typically a result of responsibility-driven design, represents object-oriented design at its best, whereas a centralized control style is reminiscent of a procedural solution, that is, a "bad" object-oriented design. We present a controlled experiment that investigates these claims empirically. A total of 99 junior, intermediate, and senior professional consultants from several international consultancy companies were hired for one day each to participate in the experiment. To enable comparisons between categories of professionals and students, 59 students also participated. The subjects used professional Java tools to perform several change tasks on two alternative Java designs with a centralized and a delegated control style, respectively. The results show that the most skilled developers, in particular the senior consultants, require less time to maintain software with a delegated control style than with a centralized one. However, more novice developers, in particular the undergraduate students and junior consultants, have serious problems understanding a delegated control style, and perform far better with a centralized control style. Thus, the maintainability of object-oriented software depends, to a large extent, on the skill of the developers who are to maintain it. These results may have serious implications for object-oriented development in an industrial context: letting senior consultants design object-oriented systems may eventually pose difficulties unless they make an effort to keep the designs simple, because the cognitive complexity of "expert" designs may be unmanageable for less skilled maintainers.
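To make the contrast between the two control styles concrete, consider the following minimal Java sketch. The billing domain and all class names (CentralizedBilling, Order, LineItem, ControlStyleDemo) are hypothetical illustrations chosen for brevity, not the designs used in the experiment. In the centralized style, one controller class makes every decision and treats the other classes as passive data holders; in the delegated style, each object is responsible for the part of the computation it knows best.

    import java.util.List;

    // Centralized control style: the controller pulls raw data out of
    // Order and LineItem via getters and makes all decisions itself.
    class CentralizedBilling {
        double total(Order order) {
            double sum = 0.0;
            for (LineItem item : order.getItems()) {
                sum += item.getUnitPrice() * item.getQuantity();
            }
            if (order.isPreferredCustomer()) {
                sum *= 0.9; // discount logic also lives in the controller
            }
            return sum;
        }
    }

    // Delegated control style: the Order computes its own total by
    // asking each LineItem for its subtotal.
    class Order {
        private final List<LineItem> items;
        private final boolean preferredCustomer;

        Order(List<LineItem> items, boolean preferredCustomer) {
            this.items = items;
            this.preferredCustomer = preferredCustomer;
        }

        List<LineItem> getItems() { return items; }
        boolean isPreferredCustomer() { return preferredCustomer; }

        double total() {
            double sum = items.stream().mapToDouble(LineItem::subtotal).sum();
            return preferredCustomer ? sum * 0.9 : sum;
        }
    }

    class LineItem {
        private final double unitPrice;
        private final int quantity;

        LineItem(double unitPrice, int quantity) {
            this.unitPrice = unitPrice;
            this.quantity = quantity;
        }

        double getUnitPrice() { return unitPrice; }
        int getQuantity() { return quantity; }

        double subtotal() { return unitPrice * quantity; }
    }

    public class ControlStyleDemo {
        public static void main(String[] args) {
            Order order = new Order(
                List.of(new LineItem(10.0, 3), new LineItem(4.5, 2)), true);
            // Same result, different allocation of responsibility:
            System.out.println(new CentralizedBilling().total(order)); // 35.1
            System.out.println(order.total());                         // 35.1
        }
    }

The sketch also hints at why the experimental results could cut both ways: a change task such as altering how a subtotal is computed touches only LineItem in the delegated design but requires editing the controller in the centralized one, whereas a reader tracing the delegated design's control flow must follow the computation across several classes, which is plausibly the comprehension cost that the less experienced subjects ran into.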
