An iterative version of the adaptive Gaussian mixture filter

The adaptive Gaussian mixture filter (AGM) was introduced as a robust filtering technique for large-scale applications and as an alternative to the well-known ensemble Kalman filter (EnKF). It consists of two analysis steps: a linear update and a weighting/resampling step. The bias of the AGM is governed by two parameters: an adaptive weight parameter, which forces the weights towards uniformity to avoid filter collapse, and a predetermined bandwidth parameter, which sets the size of the linear update. It has been shown that the filter is asymptotically optimal if, as the sample size increases, the adaptive parameter approaches one and the bandwidth parameter decreases towards zero. For large-scale applications with a limited sample size, however, the filter solution may be far from optimal: the adaptive parameter drops towards zero depending on how poorly the samples from the prior distribution match the data, and the bandwidth parameter must often be chosen significantly different from zero to produce linear updates large enough to match the data, at the expense of bias in the estimates. The iterative AGM introduced here exploits the fact that the history matching problem is usually one of estimating parameters and initial conditions. If the prior distribution of initial conditions and parameters is close to the posterior distribution, the historical data can be matched with a small bandwidth parameter and an adaptive weight parameter close to one; hence the bias of the filter solution is small. To obtain this scenario, we run the AGM iteratively through the data history with a very small bandwidth, creating a new prior distribution from the updated samples after each iteration. After a few iterations, nearly all samples from the previous iteration match the data, and the above scenario is achieved.
A simple toy problem shows that the iterative AGM can reconstruct the true posterior distribution. A 2D synthetic reservoir case is then revisited to demonstrate the potential of the new method on large-scale problems.
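The iterative scheme described above can be sketched in a few lines. The following Python sketch is illustrative only and makes simplifying assumptions not stated in the abstract: a linear observation operator H with Gaussian observation error R, the adaptive weight parameter chosen as the effective-sample-size ratio N_eff/N, and plain multinomial resampling. The function names (agm_update, iterative_agm, forward) are placeholders, not the authors' implementation.

```python
import numpy as np

def agm_update(X, y, H, R, h, rng):
    """One AGM analysis step on an (N, d) ensemble X:
    a bandwidth-scaled linear (Kalman-type) update followed by
    adaptively damped importance weighting and resampling."""
    N, d = X.shape
    # Ensemble covariance scaled by the squared bandwidth h
    C = (h ** 2) * np.atleast_2d(np.cov(X.T))
    S = H @ C @ H.T + R                 # innovation covariance
    Sinv = np.linalg.inv(S)
    K = C @ H.T @ Sinv                  # bandwidth-scaled gain
    innov = y - X @ H.T                 # innovations, shape (N, m)
    Xa = X + innov @ K.T                # linear update of each member
    # Importance weights from the Gaussian likelihood of each member
    logw = -0.5 * np.einsum("ij,jk,ik->i", innov, Sinv, innov)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Adaptive damping (assumed form): alpha = N_eff / N in (0, 1];
    # alpha near zero pushes the weights towards uniformity
    alpha = 1.0 / (N * np.sum(w ** 2))
    w = alpha * w + (1.0 - alpha) / N
    # Multinomial resampling according to the damped weights
    idx = rng.choice(N, size=N, p=w)
    return Xa[idx], alpha

def iterative_agm(X0, observations, forward, H, R, h, n_iter, seed=0):
    """Iterative AGM: rerun the filter over the full data history,
    using the updated samples as the prior for the next pass."""
    rng = np.random.default_rng(seed)
    X = X0
    for _ in range(n_iter):
        Xk = X.copy()
        for y in observations:
            Xk = forward(Xk)            # propagate ensemble to data time
            Xk, alpha = agm_update(Xk, y, H, R, h, rng)
        X = Xk                          # new prior for the next iteration
    return X
```

In a real history matching setting, forward would be the reservoir simulator acting on an ensemble of parameter/state vectors; here it is left abstract (e.g. the identity for a pure parameter estimation toy problem).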
