Iterative Design of Experiments by Non-Linear PLS Models. A Case Study: The Reservoir Simulator Data to Forecast Oil Production

In this paper we present a way of conducting design of experiments with Multivariate Additive Partial Least-Squares Splines models, MAPLSS for short. In the framework of optimal experimental design based on small samples, and in order to select the most informative MAPLSS model, we carry out an adaptive incremental selection of observations by means of a particular bootstrap procedure. Why MAPLSS models? Because they inherit the advantages of PLS regression, which makes it possible to capture additively non-linear main effects and relevant interactions in the difficult setting of small samples. The effectiveness of this approach is illustrated on reservoir simulator data used to forecast oil production.
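To make the adaptive scheme concrete, the Python sketch below mimics a bootstrap-driven, incremental selection of design points with a generic nonlinear PLS surrogate. It is only a minimal sketch under stated assumptions: MAPLSS itself is not reproduced here and is approximated by spline-transformed inputs followed by ordinary PLS regression (scikit-learn's SplineTransformer and PLSRegression); the names make_surrogate, bootstrap_uncertainty, adaptive_design and run_simulator are illustrative and do not come from the paper.

```python
# Hedged sketch of an adaptive, bootstrap-driven design-of-experiments loop.
# MAPLSS is approximated by spline features fed to ordinary PLS regression;
# all function and variable names below are illustrative placeholders.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.cross_decomposition import PLSRegression
from sklearn.utils import resample


def make_surrogate(n_components=2, n_knots=5):
    """Spline basis + PLS: a rough stand-in for an additive spline PLS model."""
    return make_pipeline(
        SplineTransformer(n_knots=n_knots, degree=3),
        PLSRegression(n_components=n_components),
    )


def bootstrap_uncertainty(X_train, y_train, X_candidates, n_boot=50, seed=0):
    """Spread of bootstrap predictions at each candidate design point."""
    rng = np.random.RandomState(seed)
    preds = []
    for _ in range(n_boot):
        Xb, yb = resample(X_train, y_train, random_state=rng)
        model = make_surrogate().fit(Xb, yb)
        preds.append(model.predict(X_candidates).ravel())
    return np.std(np.vstack(preds), axis=0)


def adaptive_design(X_init, y_init, X_pool, run_simulator, n_new_points=10):
    """Iteratively add the candidate whose prediction is least stable."""
    X_train, y_train = X_init.copy(), np.asarray(y_init, dtype=float).copy()
    pool = X_pool.copy()
    for _ in range(n_new_points):
        scores = bootstrap_uncertainty(X_train, y_train, pool)
        i = int(np.argmax(scores))            # most informative candidate
        x_new = pool[i:i + 1]
        y_new = run_simulator(x_new)          # costly simulator evaluation
        X_train = np.vstack([X_train, x_new])
        y_train = np.append(y_train, y_new)
        pool = np.delete(pool, i, axis=0)
    return X_train, y_train
```

In this sketch the "most informative" point is simply the candidate whose bootstrap predictions vary the most; the paper's own bootstrap criterion and the MAPLSS model selection step are more elaborate than this proxy.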
