Perspectives on the Future of Software Engineering

Traditional engineering disciplines such as mechanical and electrical engineering are guided by physical laws. These laws provide the constraints for acceptable engineering solutions by enforcing regularity and thereby limiting complexity, and violations can be experienced instantly in the lab. Software engineering is not constrained by physical laws. Consequently, we often create software artifacts that are too complex to be understood, tested, or maintained. Because overly complex software solutions may even work initially, we are tempted to believe that no laws apply at all. We only learn about the violation of some form of “cognitive laws” late during development or during maintenance, when excessive complexity causes follow-on defects or drives up maintenance costs. Innovative life cycle process models (e.g., the Spiral model) provide a basis for evaluating such risks incrementally and adjusting predictions accordingly. The proposal in this paper is to work towards a scientific basis for software engineering by capturing more such time-lagging dependencies among software artifacts in the form of empirical models, thereby making developers aware of the “cognitive laws” that must be adhered to. This paper attempts to answer the following questions: why we need software engineering laws and what they might look like; how we must organize our discipline in order to establish such laws; which laws already exist and how further laws could be developed; how such laws could contribute to the maturing of the science and engineering of software; and what challenges remain for teaching, research, and practice.

D. Rombach, University of Kaiserslautern & Fraunhofer IESE, 67663 Kaiserslautern, Germany. E-mail: dieter.rombach@iese.fraunhofer.de

In: J. Münch and K. Schmid (eds.), Perspectives on the Future of Software Engineering. DOI 10.1007/978-3-642-37395-4_12.
A previous version of this paper was published in the International Journal of Software and Informatics, 2011, 5(3):525–534. © 2011 ISCAS; reprinted with permission.
