Caveats and Pitfalls of Production Forecast Uncertainty Analysis Using Design of Experiments

Design of Experiments (DoE) is one of the most commonly employed techniques in the petroleum industry for Assisted History Matching (AHM) and uncertainty analysis of reservoir production forecasts. Although conceptually straightforward, DoE is often misused by practitioners because many of its statistical and modeling principles are not carefully followed. Our earlier paper (Li et al. 2019) detailed the best practices in DoE-based AHM for brownfields. However, to the best of our knowledge, no study has summarized the common caveats and pitfalls in DoE-based production forecast uncertainty analysis for greenfields and history-matched brownfields. Our objective here is to summarize these caveats and pitfalls to help practitioners apply the correct principles in DoE-based production forecast uncertainty analysis. Over 60 common pitfalls across all stages of a DoE workflow are summarized. Special attention is paid to the following critical project transitions: (1) from static earth modeling to dynamic reservoir simulation; (2) from AHM to production forecast; and (3) from analyzing subsurface uncertainties to analyzing field-development alternatives. Most pitfalls can be avoided by consistently following statistical and modeling principles. Some pitfalls, however, can trap even experienced engineers. For example, mistakes made in handling the three transitions above can yield highly unreliable proxies and sensitivity analyses. In the representative examples we study, such mistakes can reduce the proxy R² to less than 0.2, compared with greater than 0.9 when the transitions are handled correctly. Two improved experimental designs are created to resolve this challenge. Besides the technical pitfalls that are avoidable via robust statistical workflows, we also highlight the often more severe non-technical pitfalls that cannot be evaluated by measures such as R². We share thoughts on how these can be avoided, especially during project framing and the three critical transitions.
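As a minimal illustration of the proxy-quality check discussed above, the sketch below (not taken from the paper; the uncertainty factors and the analytic stand-in "simulator" are hypothetical assumptions) fits a quadratic response-surface proxy to a Latin hypercube design and reports its cross-validated R², the kind of diagnostic that separates a reliable proxy (R² above ~0.9) from an unreliable one (R² below 0.2).

```python
# Sketch only: DoE sampling, quadratic proxy fit, and cross-validated R^2.
# The "simulator" is a hypothetical analytic stand-in, not a real reservoir model.
import numpy as np
from scipy.stats import qmc
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical subsurface uncertainty factors scaled to [0, 1],
# e.g. permeability multiplier, aquifer strength, rel-perm exponent.
n_factors, n_runs = 3, 40
design = qmc.LatinHypercube(d=n_factors, seed=0).random(n_runs)

def simulator(x):
    # Stand-in for a simulated response (e.g. cumulative oil), purely illustrative.
    return (100 + 40 * x[:, 0] - 25 * x[:, 1] ** 2
            + 15 * x[:, 0] * x[:, 2]
            + rng.normal(0, 2, size=len(x)))

response = simulator(design)

# Quadratic response surface, a common proxy choice in DoE workflows.
proxy = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())

# Cross-validated R^2: a low value flags an unreliable proxy before it is
# used for sensitivity analysis or probabilistic forecasting.
r2_scores = cross_val_score(proxy, design, response, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2_scores.mean():.2f}")
```

Cross-validation (rather than R² on the fitting runs alone) is used here because an over-fitted proxy can look deceptively accurate on the design points themselves.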
