Reweighting Monte Carlo Mixtures

Markov chain Monte Carlo (e.g., the Metropolis algorithm, the Hastings algorithm, and the Gibbs sampler) is a general multivariate simulation method applicable to a wide range of problems: it permits sampling from any stochastic process whose density is known up to a constant of proportionality. The Gibbs sampler has recently received much attention as a method of simulating from posterior distributions in Bayesian inference, but Markov chain Monte Carlo is no less important in frequentist inference, with applications in maximum likelihood, hypothesis testing, and the parametric bootstrap.

It is most useful when combined with importance reweighting, so that a Monte Carlo sample from one distribution can be used for inference about many distributions. In Bayesian inference, reweighting permits the calculation of the posteriors corresponding to a range of priors from a Monte Carlo sample from just one posterior. In likelihood inference, reweighting permits the calculation of the whole likelihood function from a Monte Carlo sample from just one distribution in the model. Given this estimate of the likelihood, a parametric bootstrap calculation of the sampling distribution of the maximum likelihood estimate can be done with just one more Monte Carlo sample.

Although reweighting can save much calculation, it does not work well unless the distribution being reweighted places appreciable mass in all regions of interest; hence it is often not advisable to sample from a single distribution in the model. Reweighting a mixture of distributions in the model may perform much better, but using such a mixture gives rise to another problem when the densities are known only up to constants of proportionality: the normalizing constants must be calculated to obtain the mixture density. Direct Monte Carlo estimation of these constants, though possible, is very inefficient. A new method, reverse logistic regression, estimates these constants accurately, permitting the use of such mixture estimates in Markov chain Monte Carlo.
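To make the reweighting idea concrete, here is a minimal sketch in Python. It is not from the paper: the exponential family, the names log_f and reweighted_mean, and the sample size are all illustrative. A single sample from the theta = 0 member of the model is reweighted, via self-normalized importance weights, to estimate means under other parameter values; the unknown normalizing constants cancel in the weight ratios.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized model densities f_theta(x) proportional to exp(theta*x - x**2/2),
# an exponential family; theta = 0 is a standard normal and E_theta[X] = theta.
def log_f(x, theta):
    return theta * x - 0.5 * x**2

# One Monte Carlo sample from the theta = 0 member (by MCMC in general;
# drawn directly here since theta = 0 happens to be a standard normal).
x = rng.normal(size=10_000)

def reweighted_mean(g, theta):
    # Self-normalized importance weights: unknown constants cancel.
    logw = log_f(x, theta) - log_f(x, 0.0)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return np.sum(w * g(x))

print(reweighted_mean(lambda t: t, 0.5))  # close to the true mean 0.5
print(reweighted_mean(lambda t: t, 3.0))  # unreliable: poor overlap with theta = 0
```

The last call illustrates the failure mode noted above: when theta is far from 0, the sampling distribution places little mass where f_theta concentrates, a few points carry nearly all the weight, and the estimate degrades.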

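The following toy sketch, under the same hedged assumptions (illustrative two-component setup, samples drawn directly rather than by MCMC), shows the reverse logistic regression idea for recovering the unknown normalizing constants: pooled samples keep their component labels, and a logistic-regression-style log likelihood in the log constants is maximized, identifying them up to one common additive constant.

```python
import numpy as np
from scipy.optimize import minimize

# Two unnormalized densities whose constants the method treats as unknown:
# h0 = unnormalized N(0, 1), h1 = unnormalized N(3, 2**2); true c1/c0 = 2.
def log_h(x, j):
    if j == 0:
        return -0.5 * x**2
    return -0.5 * ((x - 3.0) / 2.0) ** 2

rng = np.random.default_rng(0)
n = 5_000
x = np.concatenate([rng.normal(0.0, 1.0, n), rng.normal(3.0, 2.0, n)])
lab = np.repeat([0, 1], n)  # known component labels for the pooled sample

L = np.column_stack([log_h(x, 0), log_h(x, 1)])  # log h_j(x_i)

def neg_loglik(eta):
    # P(label j | x) proportional to h_j(x) * exp(eta_j); with equal sample
    # sizes the mixing proportions cancel. At the optimum, eta_j tracks
    # -log c_j up to an additive constant.
    z = L + eta
    lse = np.logaddexp(z[:, 0], z[:, 1])
    return -(z[np.arange(z.shape[0]), lab] - lse).sum()

# eta is identified only up to an additive constant; pin eta_0 = 0.
res = minimize(lambda e: neg_loglik(np.array([0.0, e[0]])), x0=[0.0])
log_c_ratio = -res.x[0]               # estimates log(c1 / c0)
print(log_c_ratio, np.log(2.0))       # should be close to log 2
```

With the estimated constants in hand, the mixture density can be formed up to one overall constant, and the pooled sample can then be reweighted exactly as in the previous sketch.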