Compressed Monte Carlo for Distributed Bayesian Inference

Bayesian models have become very popular in recent years in several fields, such as signal processing, statistics, and machine learning. Bayesian inference requires the approximation of complicated integrals involving the posterior distribution. For this purpose, Monte Carlo (MC) methods, such as Markov Chain Monte Carlo (MCMC) and Importance Sampling (IS) algorithms, are often employed. In this work, we introduce the theory and practice of a Compressed MC (C-MC) scheme for compressing the information contained in a cloud of samples. C-MC is particularly useful in a distributed Bayesian inference framework, when cheap and fast communication with a central processor is required. In its basic version, C-MC is closely related to the stratification technique, a well-known method used for variance reduction. Deterministic C-MC schemes, which provide very good performance, are also presented. The compression problem is closely related to the moment-matching approach applied in different filtering methods, often known as Gaussian quadrature rules or sigma-point methods. The connections to herding algorithms and the quasi-Monte Carlo perspective are also discussed. Numerical results confirm the benefits of the introduced schemes, which outperform the corresponding benchmark methods.
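To make the compression idea concrete, below is a minimal, illustrative sketch (not the paper's exact algorithm) of a stratified compression step: a cloud of N weighted samples is split into M strata of roughly equal total weight, and each stratum is summarized by its weighted mean and accumulated weight, which is what a node would transmit to the central processor. The function name compress_samples and the sorting-based stratification rule are assumptions made for this example.

```python
import numpy as np

def compress_samples(samples, weights, M):
    """Compress a weighted sample cloud {(x_n, w_n)} into M summary particles,
    in the spirit of the basic (stratified) C-MC idea: partition the cloud into
    M strata of roughly equal total weight and keep one weighted mean per
    stratum. Illustrative sketch only."""
    samples = np.asarray(samples, dtype=float)
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                 # normalize the weights

    # Simple stratification: order samples by their first coordinate and cut
    # the cumulative weight into M groups of (roughly) equal total weight.
    order = np.argsort(samples[:, 0])
    cum_w = np.cumsum(weights[order])
    strata = np.minimum((cum_w * M).astype(int), M - 1)

    comp_x, comp_w = [], []
    for m in range(M):
        idx = order[strata == m]
        if idx.size == 0:
            continue
        comp_w.append(weights[idx].sum())             # summary weight
        comp_x.append(np.average(samples[idx], axis=0, weights=weights[idx]))
    return np.array(comp_x), np.array(comp_w)


if __name__ == "__main__":
    # Toy example: compress 5000 weighted samples into 20 summary particles
    # and compare the weighted-mean estimates before and after compression.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5000, 2))
    w = np.exp(-0.5 * np.sum((x - 1.0) ** 2, axis=1))  # unnormalized weights
    xc, wc = compress_samples(x, w, M=20)
    print("full-cloud mean :", np.average(x, axis=0, weights=w))
    print("compressed mean :", np.average(xc, axis=0, weights=wc))
```

By construction, this compressed cloud preserves the weighted mean of the full cloud exactly, while higher-order moments are only approximated; moment-matching or deterministic variants of the compression step aim to capture more of the posterior information with the same number of summary particles.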
