Bayesian calibration of a numerical code for prediction

Field experiments are often difficult and expensive to carry out. To circumvent these issues, industrial companies have developed numerical codes intended to represent the physical system, but these codes raise problems of their own. Code validation is one of them: despite continuous code development, the difference between code output and experiments can remain significant. Two kinds of uncertainty are involved. The first comes from the difference between the physical phenomenon and the values recorded experimentally, and is often represented by white Gaussian noise. The second concerns the gap between the code and the physical system itself. To reduce this gap, often called model bias or model error, computer codes are generally made more complex, and hence more realistic, but these improvements lead to time-consuming codes. Moreover, a code often depends on parameters that must be set by the user to bring the code as close as possible to field data. This estimation task is called calibration, and it can be performed with a time-consuming or a fast code, with or without a model-discrepancy term. This paper provides a review of the main calibration methods developed to date. An application case is used to illustrate the decisions made throughout the article and to discuss the points on which methods diverge. This example, motivated by an industrial and financial context, uses a code that predicts the power output of a photovoltaic plant and is employed in a prediction setting.
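The calibration setting described above can be sketched numerically. The following is a minimal illustration, not the paper's method: a hypothetical toy "code" with one tuning parameter theta is fit to synthetic field data that contain both white Gaussian measurement noise and a small systematic model discrepancy, and the posterior of theta under a Gaussian likelihood and a flat prior is explored with random-walk Metropolis sampling. All names and numerical values (the linear code, the quadratic discrepancy, the noise level) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "computer code": predicted power as a linear function
# of an input x, with one unknown tuning parameter theta.
def code(x, theta):
    return theta * x

# Synthetic field data: the "true" system adds a small systematic bias
# (model discrepancy) plus white Gaussian measurement noise.
x_field = np.linspace(0.1, 1.0, 30)
true_theta, sigma_eps = 0.8, 0.02
y_field = (code(x_field, true_theta)
           + 0.05 * x_field**2                       # model discrepancy
           + rng.normal(0.0, sigma_eps, x_field.size))  # measurement noise

# Log-posterior of theta under a Gaussian likelihood and a flat prior
# (plain calibration: no discrepancy term in the statistical model).
def log_post(theta):
    r = y_field - code(x_field, theta)
    return -0.5 * np.sum(r**2) / sigma_eps**2

# Random-walk Metropolis sampling of the posterior of theta.
theta, lp, samples = 0.5, log_post(0.5), []
for _ in range(5000):
    prop = theta + rng.normal(0.0, 0.02)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[1000:])  # discard burn-in
print("posterior mean of theta:", post.mean())
```

Because the statistical model here ignores the discrepancy, the posterior concentrates on a value slightly above the true theta: the calibrated parameter absorbs part of the model error, which is precisely the pitfall that motivates including an explicit discrepancy term.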
