A survey of Monte Carlo methods for parameter estimation

Statistical signal processing applications usually require the estimation of some parameters of interest given a set of observed data. These estimates are typically obtained either by solving a multivariate optimization problem, as in the maximum likelihood (ML) or maximum a posteriori (MAP) estimators, or by performing a multidimensional integration, as in the minimum mean squared error (MMSE) estimators. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and the Monte Carlo (MC) methodology is one feasible approach for approximating them. MC methods proceed by drawing random samples, either from the desired distribution or from a simpler one, and using them to compute consistent estimators. The most important families of MC algorithms are Markov chain MC (MCMC) and importance sampling (IS). On the one hand, MCMC methods draw candidate samples from a proposal density and then build an ergodic Markov chain whose stationary distribution is the desired distribution by accepting or rejecting those candidates as the new state of the chain. On the other hand, IS techniques draw samples from a simple proposal density and then assign them weights that account for the mismatch between the proposal and the target distribution. In this paper, we perform a thorough review of MC methods for the estimation of static parameters in signal processing applications. A historical note on the development of MC schemes is also provided, followed by a description of the basic MC method and the rejection sampling (RS) algorithm, as well as three sections covering many of the most relevant MCMC and IS algorithms and their combined use. Finally, five numerical examples (including the estimation of the parameters of a chaotic system, a localization problem in wireless sensor networks, and a spectral analysis application) are provided to demonstrate the performance of the described approaches.
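As a concrete illustration of the two families described above, the following Python sketch (not taken from the paper; the toy bimodal target, proposal scales, and sample sizes are assumptions made purely for the example) implements a random-walk Metropolis-Hastings sampler and a self-normalized importance sampling estimator of a posterior mean.

```python
# Illustrative sketch (assumed toy setup): random-walk Metropolis-Hastings and
# self-normalized importance sampling for the posterior mean of a 1-D target.
import numpy as np

rng = np.random.default_rng(0)

def log_target(theta):
    # Unnormalized log-density of a toy bimodal "posterior" with modes at +/-2.
    return np.logaddexp(-0.5 * (theta - 2.0) ** 2, -0.5 * (theta + 2.0) ** 2)

# --- MCMC: accept or reject candidates as the new state of the chain ---
def metropolis_hastings(n_iter=20000, sigma=1.0):
    theta = 0.0
    chain = np.empty(n_iter)
    for t in range(n_iter):
        candidate = theta + sigma * rng.standard_normal()
        # Symmetric proposal: acceptance ratio depends only on the target.
        if np.log(rng.uniform()) < log_target(candidate) - log_target(theta):
            theta = candidate
        chain[t] = theta
    return chain

# --- IS: draw from a simple proposal and weight the samples ---
def importance_sampling(n_samples=20000, proposal_scale=4.0):
    samples = proposal_scale * rng.standard_normal(n_samples)
    # Gaussian proposal log-density (additive constant omitted; it cancels
    # in the self-normalized weights).
    log_proposal = -0.5 * (samples / proposal_scale) ** 2 - np.log(proposal_scale)
    log_w = log_target(samples) - log_proposal
    w = np.exp(log_w - log_w.max())   # subtract max for numerical stability
    w /= w.sum()                      # self-normalized importance weights
    return np.sum(w * samples)        # MMSE-type estimate of E[theta | data]

print("MH posterior mean:", metropolis_hastings()[5000:].mean())  # discard burn-in
print("IS posterior mean:", importance_sampling())
```

Since the toy target is symmetric around zero, both estimates should be close to zero; the MH chain discards its early iterations as burn-in, while the IS estimator uses all samples but down-weights those drawn in regions where the proposal overestimates the target.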
