Heretical Multiple Importance Sampling

Multiple importance sampling (MIS) methods approximate moments of complicated distributions by drawing samples from a set of proposal distributions. Several ways to compute the importance weights assigned to each sample have recently been proposed, with the so-called deterministic mixture (DM) weights providing the best performance in terms of variance, at the expense of increased computational cost. Recent work has shown that a tradeoff between variance reduction and computational effort can be achieved by performing an a priori random clustering of the proposals (partial DM algorithm). In this paper, we propose a novel "heretical" MIS framework, where the clustering is performed a posteriori with the goal of reducing the variance of the importance sampling weights. This approach yields biased estimators with a potentially large reduction in variance. Numerical examples show that heretical MIS estimators can outperform, in terms of mean squared error, both the standard and the partial MIS estimators, achieving performance close to that of DM at a lower computational cost.
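The contrast between standard MIS weights and the deterministic mixture (DM) weights mentioned above can be sketched as follows. This is a minimal illustration, not the paper's method: the Gaussian target, the grid of proposal locations, and the estimand (the target mean) are assumptions made purely for the example. Note how the DM weight denominator requires evaluating all N proposal densities per sample, which is the extra computational cost the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: standard normal pi(x); we estimate E[X] = 0 (illustrative choice).
def pi_pdf(x):
    return np.exp(-0.5 * x**2) / np.sqrt(2.0 * np.pi)

# N Gaussian proposals q_n = Normal(mu_n, sigma^2), one sample drawn from each.
mus = np.linspace(-3.0, 3.0, 50)
sigma = 1.0
xs = rng.normal(mus, sigma)  # x_n ~ q_n

def q_pdf(x, mu):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

pi_vals = pi_pdf(xs)

# Standard MIS weights: w_n = pi(x_n) / q_n(x_n)
# (one density evaluation per sample).
w_std = pi_vals / q_pdf(xs, mus)

# DM weights: w_n = pi(x_n) / ((1/N) * sum_j q_j(x_n))
# (N density evaluations per sample, but typically lower weight variance).
mixture = np.mean(q_pdf(xs[:, None], mus[None, :]), axis=1)
w_dm = pi_vals / mixture

# Self-normalized estimates of the target mean under each weighting scheme.
est_std = np.sum(w_std * xs) / np.sum(w_std)
est_dm = np.sum(w_dm * xs) / np.sum(w_dm)
```

The partial DM scheme sits between these two extremes: the mixture in the denominator runs only over the proposals in the same cluster as q_n, so its cost scales with cluster size rather than N. The heretical variant chooses those clusters a posteriori, after seeing the samples, which is what introduces the bias discussed in the abstract.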
