Early Identification of SE-Related Program Risks: Opportunities for DoD Systems Engineering (SE) Transformation via SE Effectiveness Measures (EMs) and Evidence-Based Reviews

Abstract: DoD programs need effective systems engineering (SE) to succeed, and DoD program managers need early warning of any risks to achieving effective SE. This SERC project has synthesized analyses of DoD SE effectiveness risk sources into a lean framework and toolset for early identification of SE-related program risks. Three important points need to be made about these risks:

* They are generally not indicators of bad SE. Although SE can be done badly, more often the risks are consequences of inadequate program funding (SE is the first victim of an underbudgeted program), of misguided contract provisions (when a program manager must choose between allocating limited SE resources toward producing contract-incentivized functional specifications and addressing key performance parameter risks, the path of least resistance is to obey the contract), or of management temptations to show early progress on the easy parts while deferring the hard parts until later.
* Analyses have shown that unaddressed risk generally leads to serious budget and schedule overruns.
* Risks are not necessarily bad. If an early capability is needed, and the risky solution has been shown to be superior to the alternatives, accepting the risk and focusing on mitigating it is generally better than waiting for a better alternative to appear.

Unlike traditional schedule-based and event-based reviews, the SERC SE EM technology enables sponsors and performers to agree on the nature and use of more effective evidence-based reviews. These enable early detection of missing SE capabilities or personnel competencies with respect to a framework of Goals, Critical Success Factors (CSFs), and Questions, which the EM task derived from the leading DoD early-SE CSF analyses. The EM tools enable risk-based prioritization of corrective actions, since shortfalls in evidence for each question are early indicators of uncertainty and risk.
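The risk-based prioritization the abstract describes can be sketched as follows. This is a minimal illustration, not the actual SERC toolset: the scoring scheme (evidence shortfall treated as probability of loss, so risk exposure = shortfall × impact) and all class, field, and question names are assumptions introduced here for illustration.

```python
from dataclasses import dataclass

@dataclass
class Question:
    csf: str          # Critical Success Factor this question probes
    text: str
    evidence: float   # 0.0 = no evidence produced .. 1.0 = full evidence
    impact: float     # 0.0 .. 1.0 severity if the shortfall materializes

def risk_exposure(q: Question) -> float:
    # Shortfall in evidence acts as the probability of loss:
    # risk exposure = P(loss) * size(loss).
    return (1.0 - q.evidence) * q.impact

def prioritize(questions: list[Question]) -> list[Question]:
    # Highest risk exposure first: the ordering for corrective actions.
    return sorted(questions, key=risk_exposure, reverse=True)

if __name__ == "__main__":
    qs = [
        Question("Concept Development", "Are KPPs traceable to mission needs?", 0.8, 0.9),
        Question("Technical Staffing", "Are key SE roles filled?", 0.3, 0.7),
        Question("Requirements", "Are external interfaces baselined?", 0.5, 0.6),
    ]
    for q in prioritize(qs):
        print(f"{risk_exposure(q):.2f}  [{q.csf}] {q.text}")
```

Under this scheme a question with weak evidence and high impact (the staffing question above, exposure 0.49) outranks one with strong evidence and higher impact (the KPP question, exposure 0.18), which matches the abstract's point that evidence shortfalls, not raw impact, drive the prioritization.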
