Alternative effective sample size measures for importance sampling

The Effective Sample Size (ESS) is an important measure of efficiency in the Importance Sampling (IS) technique. A well-known approximation of the theoretical ESS definition, given by the inverse of the sum of the squares of the normalized importance weights, is widely applied in the literature. This expression has become an essential ingredient of Sequential Monte Carlo (SMC) methods that employ adaptive resampling procedures. In this work, we first show that this ESS approximation is related to the Euclidean distance between the probability mass function (pmf) described by the normalized weights and the uniform pmf. We then derive other possible ESS functions based on different discrepancy measures. Our study also includes another ESS measure already proposed in the literature, the perplexity, which is based on the discrete entropy of the normalized weights. We compare all of these measures by means of numerical simulations.
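To make the quantities mentioned above concrete, the following Python sketch computes the two ESS measures named in the abstract from a set of normalized importance weights: the classical approximation (inverse of the sum of squared weights) and the entropy-based perplexity. The function names and the toy weight vector are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def normalize_weights(log_w):
    """Turn raw log-importance-weights into a pmf (stable in log-space)."""
    log_w = np.asarray(log_w, dtype=float)
    w = np.exp(log_w - log_w.max())   # subtract max to avoid overflow
    return w / w.sum()

def ess_inverse_sum_squares(w_bar):
    """Classical ESS approximation: 1 / sum of squared normalized weights."""
    return 1.0 / np.sum(w_bar ** 2)

def ess_perplexity(w_bar):
    """Perplexity-based ESS: exp of the discrete (Shannon) entropy of the weights."""
    nz = w_bar[w_bar > 0]             # use the convention 0 * log(0) = 0
    return np.exp(-np.sum(nz * np.log(nz)))

# Illustrative example: N = 5 samples, one weight dominating
w_bar = normalize_weights(np.log([0.7, 0.1, 0.1, 0.05, 0.05]))
print(ess_inverse_sum_squares(w_bar))   # ~1.94, well below N = 5
print(ess_perplexity(w_bar))            # ~2.74, also signals weight degeneracy
```

Both measures equal N when the weights are uniform and drop toward 1 as a single weight dominates, which is the behavior an adaptive resampling rule in SMC typically monitors.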
