Bayesian posterior comprehension via Message from Monte Carlo

We discuss the problem of producing an epitome, or brief summary, of a Bayesian posterior distribution and then investigate a general solution based on the Minimum Message Length (MML) principle. Clearly, the optimal criterion for choosing such an epitome is determined by the epitome’s intended use. The interesting general case is where this use is unknown and, in order to be practical, the choice of epitome criterion becomes subjective. We identify a number of desirable properties that an epitome could have: facilitation of point estimation, human comprehension, and fast approximation of posterior expectations. We call these the properties of Bayesian Posterior Comprehension and show that the Minimum Message Length principle can be viewed as an epitome criterion that produces epitomes having these properties. We then present and extend Message from Monte Carlo as a means for constructing instantaneous Minimum Message Length codebooks (and epitomes) using Markov Chain Monte Carlo methods. The Message from Monte Carlo methodology is illustrated for binary regression, generalised linear model, and multiple change-point problems.

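To make the notion of an epitome concrete, the sketch below shows how a small weighted set of parameter values can stand in for a full posterior sample when approximating posterior expectations. It is an illustration only, not the MML/Message from Monte Carlo construction described in the abstract: the representative points are chosen by a simple k-means-style clustering of MCMC-style draws, the weights are cluster proportions, and the names (build_epitome, draws) are invented for this example.

# Illustrative sketch: a weighted "epitome" of posterior draws used for
# fast approximation of posterior expectations. The clustering/weighting
# scheme is an assumption for illustration, not the paper's MML codebook.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for MCMC output: draws from a bimodal 1-D posterior.
draws = np.concatenate([rng.normal(-2.0, 0.5, 5000),
                        rng.normal(3.0, 1.0, 5000)])

def build_epitome(samples, k=2, iters=50):
    """Summarise samples by k representative points with weights."""
    centres = np.quantile(samples, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        # Assign each draw to its nearest representative point.
        labels = np.argmin(np.abs(samples[:, None] - centres[None, :]), axis=1)
        # Move each representative point to the mean of its assigned draws.
        centres = np.array([samples[labels == j].mean() for j in range(k)])
    weights = np.bincount(labels, minlength=k) / len(samples)
    return centres, weights

theta, w = build_epitome(draws)

# Approximate E[g(theta) | data] with the epitome and compare to the
# full-sample Monte Carlo estimate.
g = np.square
print("epitome estimate :", np.dot(w, g(theta)))
print("full-sample value:", g(draws).mean())

The point of the sketch is that once the weighted point set is in hand, any posterior expectation reduces to a short weighted sum, which is one of the Bayesian Posterior Comprehension properties the abstract attributes to MML-based epitomes.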