Exploring the Robustness of Risk Reduction Strategies

Risks permeate the development and operation of many complex software systems. Nearly all risk reduction activities incur cost, and the total cost of all potentially applicable activities typically far exceeds the resources available. Hence the need to pick judiciously from among those activities to arrive at a cost-effective selection. We call the approach used to make this selection a “risk reduction strategy”. A risk reduction strategy could be as simple as a fixed set of activities applied to all software systems (the effort to perform them will typically depend on the size and complexity of the software). A refinement of this is an escalating series of such fixed sets, increasing in thoroughness (and cost), from which one is chosen based on an initial assessment of the system’s criticality and risk. This is the approach followed by NASA’s IV&V facility: “The results of CA [Criticality Analysis] are used to determine the set of tasks to be performed on each software component and to focus the emphasis and intensity of the IV&V effort on the most important areas” [1]. In a similar vein, the “Ask Pete” software developed at NASA Glenn guided software quality engineers in planning their software quality activities [2]. Yet more sophisticated selection strategies have been proposed that hinge upon models estimating software-specific risk prevalence and the cost and effectiveness of the available activities at reducing risks.
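The escalating-series approach described above can be sketched in a few lines: an initial criticality score selects the most thorough activity set whose threshold it meets. The tier thresholds and activity names below are illustrative assumptions, not taken from NASA's Criticality Analysis itself.

```python
# Hypothetical sketch of an escalating series of fixed activity sets,
# chosen by an initial criticality assessment.  Thresholds and activity
# names are invented for illustration.

TIERS = [
    # (minimum criticality score, activity set)
    (0.0, ["code review"]),
    (0.4, ["code review", "unit test audit"]),
    (0.7, ["code review", "unit test audit", "independent V&V"]),
]

def select_activities(criticality: float) -> list[str]:
    """Return the most thorough activity set whose threshold is met."""
    chosen = TIERS[0][1]
    for threshold, activities in TIERS:
        if criticality >= threshold:
            chosen = activities
    return chosen
```

Each higher tier subsumes the one below it, mirroring the idea of sets "increasing in thoroughness (and cost)".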
Instances of such model-based strategies include:

• COQUALMO (Constructive QUALity Model), which “enables 'what-if' analyses that demonstrate the impact of various defect removal techniques and the effects of personnel, project, product and platform characteristics on software quality” [3]

• PATT (Process Analysis Tradeoff Tool), which “can be used to quantify the costs and benefits associated with NASA process decisions and specifically IV&V practices enabling management to effectively allocate scarce resources for IV&V activities” [4]

The authors of this abstract have each been involved in the development of such models, with a focus both on the models themselves and on the strategy for arriving at a cost-effective set of risk reduction activities:

• Julian Richardson has been involved in XOMO, which couples Monte Carlo simulation and data mining to determine actions that can be taken during project development to improve project outcomes, for example reducing cost and/or risk [5]

• Daniel Port has been involved in the “Strategic Method”, a technique for generating Pareto risk reduction strategies that most reduce risk exposure at the lowest cost [6]

• Martin Feather has been involved in DDP, a risk tool that employs simulated annealing to search for near-optimal risk reduction strategies [7].
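To make the last idea concrete, the following sketch shows the kind of search a tool like DDP performs: simulated annealing over subsets of candidate risk reduction activities, seeking a selection that minimizes residual risk exposure plus cost. The activity data, the independence assumption in the scoring function, and the annealing schedule are all invented for illustration; they are not DDP's actual model.

```python
import math
import random

# Invented example data: activity -> (cost, fraction of risk removed).
ACTIVITIES = {
    "reviews":  (2.0, 0.30),
    "testing":  (5.0, 0.50),
    "analysis": (3.0, 0.25),
    "ivv":      (8.0, 0.60),
}
BASE_RISK = 100.0

def score(selection: frozenset) -> float:
    """Residual risk exposure plus total cost (lower is better).
    Assumes the activities' risk reductions combine independently."""
    risk, cost = BASE_RISK, 0.0
    for name in selection:
        c, effectiveness = ACTIVITIES[name]
        risk *= (1.0 - effectiveness)
        cost += c
    return risk + cost

def anneal(steps=2000, temp=50.0, cooling=0.995, seed=0):
    """Simulated annealing over activity subsets: toggle one activity
    per step, always accept improvements, accept worsenings with
    probability exp(-delta/temp), and cool the temperature gradually."""
    rng = random.Random(seed)
    current = best = frozenset()
    for _ in range(steps):
        neighbour = current ^ {rng.choice(list(ACTIVITIES))}
        delta = score(neighbour) - score(current)
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            current = neighbour
        if score(current) < score(best):
            best = current
        temp *= cooling
    return best, score(best)
```

With only a handful of activities the space could be enumerated outright; annealing pays off when the number of candidate activities makes exhaustive search infeasible, which is the regime these tools target.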

[1] Martin S. Feather et al., “Quantitative Risk-Based Requirements Reasoning,” Requirements Engineering, 2003.

[2] Tim Menzies et al., “XOMO: Understanding Development Options for Autonomy,” 2005.

[3] R. Kazman et al., “Practicing What is Preached: 80-20 Rules for Strategic IV&V Assessment,” 2007 IEEE International Conference on Exploring Quantifiable IT Yields, 2007.