Five-stage procedure for the evaluation of simulation models through statistical techniques

This paper recommends the following five-stage sequence for the evaluation of simulation models. 1) Validation: the availability of data on the real system determines which statistical techniques are appropriate. 2) Screening: in the simulation's pilot phase, the important inputs are identified through a novel technique, sequential bifurcation, which combines aggregation of inputs with sequential experimentation. 3) Sensitivity or what-if analysis: the important inputs are analyzed in more detail, including interactions between them; the relevant techniques are design of experiments (DOE) and regression analysis. 4) Uncertainty or risk analysis: important environmental inputs may have imprecisely known values, so the resulting uncertainty in the model outputs is quantified; the techniques are Monte Carlo and Latin hypercube sampling. 5) Optimization: policy variables can be controlled, applying Response Surface Methodology (RSM), which combines DOE, regression analysis, and steepest-ascent hill-climbing. The paper also summarizes case studies for each stage.
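
To make the screening stage concrete: sequential bifurcation aggregates factors into groups and splits only those groups whose combined effect is large. The Python sketch below illustrates the idea on a toy linear model with non-negative effects (a standard assumption of the technique); the simulate function, factor ordering, and importance threshold are illustrative assumptions, not the paper's case-study settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(levels):
    """Placeholder simulation model (hypothetical): linear in the 0/1 factor
    levels with a few large effects.  Replace with the real simulation."""
    true_effects = np.array([0.0, 4.0, 0.0, 0.0, 1.5, 0.0, 0.0, 3.0])
    return float(true_effects @ levels + rng.normal(scale=0.05))

def y_upto(j, k):
    """Output with factors 1..j at their high level and factors j+1..k low."""
    levels = np.zeros(k)
    levels[:j] = 1.0
    return simulate(levels)

def bifurcate(lo, hi, k, threshold, important):
    """Split the aggregated factor group (lo, hi] as long as its combined
    effect exceeds the threshold (assumes all effects are non-negative)."""
    if y_upto(hi, k) - y_upto(lo, k) < threshold:
        return                              # whole group judged unimportant
    if hi - lo == 1:
        important.append(hi)                # single factor left: it is important
        return
    mid = (lo + hi) // 2
    bifurcate(lo, mid, k, threshold, important)
    bifurcate(mid, hi, k, threshold, important)

k, found = 8, []
bifurcate(0, k, k, threshold=1.0, important=found)
print("important factors:", found)          # expect [2, 5, 8] for this toy model
```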
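
For the sensitivity-analysis stage, one common concrete form of DOE combined with regression is a 2^k factorial design fitted with a metamodel containing main effects and two-factor interactions. The sketch below uses a hypothetical three-input response to show how the design matrix is built and the effects are estimated by least squares; the model and its coefficients are invented for illustration.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

def model(x1, x2, x3):
    """Hypothetical simulation response at coded inputs (-1/+1)."""
    return 10 + 3*x1 - 2*x2 + 1.5*x1*x2 + rng.normal(scale=0.1)

# Full 2^3 factorial design in coded units (-1, +1).
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
y = np.array([model(*row) for row in design])

# Regression metamodel: intercept, main effects, two-factor interactions.
X = np.column_stack([
    np.ones(len(design)),
    design[:, 0], design[:, 1], design[:, 2],
    design[:, 0] * design[:, 1],
    design[:, 0] * design[:, 2],
    design[:, 1] * design[:, 2],
])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept", "x1", "x2", "x3", "x1*x2", "x1*x3", "x2*x3"], coef):
    print(f"{name:10s} {b:+.3f}")
```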
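
For the uncertainty-analysis stage, Latin hypercube sampling stratifies each uncertain input's distribution into n equal-probability intervals, draws one value per interval, and pairs the intervals at random across inputs. A minimal sketch, assuming two uncertain inputs with illustrative uniform and normal distributions and a stand-in response function:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def latin_hypercube(n, d, rng):
    """n points in [0,1)^d with one point per equal-probability stratum
    in each dimension (strata paired by random permutations)."""
    cells = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (cells + rng.random((n, d))) / n

def model(arrival_rate, service_time):
    """Hypothetical response surface standing in for a simulation run."""
    rho = arrival_rate * service_time
    return rho / (1.0 - rho)

n = 1000
u = latin_hypercube(n, 2, rng)
# Map the unit-cube sample to the (assumed) input distributions via inverse CDFs.
arrival_rate = stats.uniform(0.5, 0.3).ppf(u[:, 0])   # U(0.5, 0.8)
service_time = stats.norm(1.0, 0.05).ppf(u[:, 1])     # N(1.0, 0.05)
outputs = model(arrival_rate, service_time)

print(f"mean output: {outputs.mean():.3f}")
print("95% interval:", np.percentile(outputs, [2.5, 97.5]).round(3))
```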
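
For the optimization stage, RSM iterates local experiments: fit a first-order regression metamodel on a small factorial design around the current point, step in the direction of steepest ascent, and repeat until the first-order effects become negligible (after which a second-order model would normally be fitted). The fixed-step sketch below compresses that loop to its essentials; the objective function, step size, and stopping rule are illustrative assumptions rather than the paper's procedure.

```python
import itertools
import numpy as np

rng = np.random.default_rng(7)

def simulate(x):
    """Hypothetical noisy response with a maximum near x = (2, -1)."""
    return -(x[0] - 2)**2 - (x[1] + 1)**2 + rng.normal(scale=0.05)

def local_effects(center, half_width):
    """Fit a first-order metamodel on a 2^2 factorial design around `center`
    and return the coded main effects, which point along the steepest ascent."""
    design = np.array(list(itertools.product([-1.0, 1.0], repeat=2)))
    y = np.array([simulate(center + half_width * row) for row in design])
    X = np.column_stack([np.ones(len(design)), design])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1:]

x = np.array([0.0, 0.0])                    # starting policy-variable setting
for _ in range(20):
    effects = local_effects(x, half_width=0.25)
    if np.linalg.norm(effects) < 0.05:      # first-order effects vanish: near optimum
        break
    x = x + 0.5 * effects / np.linalg.norm(effects)   # steepest-ascent step

print("estimated optimum:", x.round(2))
```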
