Slice sampling mixture models

We propose a more efficient version of the slice sampler for Dirichlet process mixture models described by Walker (Commun. Stat., Simul. Comput. 36:45–54, 2007). This new sampler allows infinite mixture models to be fitted under a wide range of prior specifications. To illustrate this flexibility, we consider priors defined through infinite sequences of independent positive random variables. Two applications are considered: density estimation using mixture models and hazard function estimation. In each case we show how the slice-efficient sampler can be applied to make inference in the models. In the mixture case, two submodels are studied in detail. The first assumes that the positive random variables are gamma distributed and the second assumes that they are inverse-Gaussian distributed. Both priors have two hyperparameters, and we consider their effect on the prior distribution of the number of occupied clusters in a sample. We make extensive computational comparisons with alternative “conditional” simulation techniques for mixture models, using both the standard Dirichlet process prior and our new priors. The properties of the new priors are illustrated on a density estimation problem.
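The sampler builds on Walker's slice augmentation of the stick-breaking representation: a uniform latent variable u_i is introduced for each observation so that, conditional on the u_i, only the finitely many components whose weights exceed u_i need to be considered. As a rough illustration of that idea only (not the slice-efficient algorithm proposed in the paper), the sketch below runs the basic slice sampler on a Dirichlet process mixture of normals with known component variance; the function name, the synthetic data, the fixed variance, and the hyperparameters alpha, mu0, tau0, sigma are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def slice_sweep(y, d, v, mu, alpha=1.0, mu0=0.0, tau0=2.0, sigma=1.0):
    """One blocked-Gibbs sweep: (v, mu) | d, then u | v, d, then d | u, v, mu, y.

    Illustrative sketch of the basic slice sampler, not the paper's implementation.
    """
    K = len(v)
    counts = np.bincount(d, minlength=K)

    # (a) Sticks given allocations: v_j | d ~ Beta(1 + n_j, alpha + n_{>j}).
    n_after = counts[::-1].cumsum()[::-1] - counts
    v = rng.beta(1.0 + counts, alpha + n_after)
    w = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))

    # (b) Component means from the conjugate normal posterior (variance known).
    mu = mu.copy()
    for j in range(K):
        yj = y[d == j]
        prec = 1.0 / tau0**2 + len(yj) / sigma**2
        mu[j] = rng.normal((mu0 / tau0**2 + yj.sum() / sigma**2) / prec,
                           1.0 / np.sqrt(prec))

    # (c) Slice variables: u_i | v, d ~ Uniform(0, w_{d_i}).
    u = rng.uniform(0.0, w[d])

    # (d) Extend the representation until the leftover stick mass prod_j (1 - v_j)
    #     drops below min_i u_i, so every component with w_j > u_i is instantiated.
    leftover = np.prod(1.0 - v)
    while leftover > u.min():
        v_new = rng.beta(1.0, alpha)
        w = np.append(w, v_new * leftover)
        v = np.append(v, v_new)
        mu = np.append(mu, rng.normal(mu0, tau0))  # prior draw for an empty component
        leftover *= 1.0 - v_new

    # (e) Allocations: P(d_i = j) proportional to 1(w_j > u_i) N(y_i | mu_j, sigma^2).
    for i in range(len(y)):
        active = np.flatnonzero(w > u[i])
        logp = -0.5 * ((y[i] - mu[active]) / sigma) ** 2
        p = np.exp(logp - logp.max())
        d[i] = rng.choice(active, p=p / p.sum())
    return d, v, mu

# Illustrative run on synthetic two-component data (all settings are assumptions).
y = np.concatenate((rng.normal(-2.0, 1.0, 100), rng.normal(2.0, 1.0, 100)))
d = np.zeros(len(y), dtype=int)           # start with everyone in one component
v = rng.beta(1.0, 1.0, size=1)
mu = rng.normal(0.0, 2.0, size=1)
for _ in range(1000):
    d, v, mu = slice_sweep(y, d, v, mu)
print("occupied clusters:", len(np.unique(d)))
```

The slice-efficient sampler studied in the paper instead places a deterministic sequence inside the slice indicator, which gives explicit control over the trade-off between mixing and the number of components that must be instantiated per sweep; the sketch above retains Walker's original choice for simplicity.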

[1]  T. Ferguson, A Bayesian Analysis of Some Nonparametric Problems, 1973.

[2]  L. Devroye, Non-Uniform Random Variate Generation, 1986.

[3]  Adrian F. M. Smith, et al., Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion), 1993.

[4]  J. Sethuraman, A Constructive Definition of Dirichlet Priors, 1991.

[5]  Peter E. Rossi, et al., Bayesian Analysis of Stochastic Volatility Models, 1994.

[6]  M. Escobar, Estimating Normal Means with a Dirichlet Process Prior, 1994.

[7]  S. MacEachern, Estimating normal means with a conjugate style Dirichlet process prior, 1994.

[8]  Walter R. Gilks, et al., Adaptive rejection Metropolis sampling, 1995.

[9]  W. Gilks, et al., Adaptive Rejection Metropolis Sampling within Gibbs Sampling, 1995.

[10]  M. Escobar, et al., Bayesian Density Estimation and Inference Using Mixtures, 1995.

[11]  A. Sokal, Monte Carlo Methods in Statistical Mechanics: Foundations and New Algorithms, 1997.

[12]  S. MacEachern, et al., Estimating mixture of Dirichlet process models, 1998.

[13]  R. M. Neal, Markov Chain Sampling Methods for Dirichlet Process Mixture Models, 2000.

[14]  C. Robert, et al., Computational and Inferential Difficulties with Mixture Posterior Distributions, 2000.

[15]  H. Ishwaran, et al., Markov chain Monte Carlo in approximate Dirichlet and beta two-parameter process hierarchical models, 2000.

[16]  Lancelot F. James, et al., Gibbs Sampling Methods for Stick-Breaking Priors, 2001.

[17]  P. Green, et al., Modelling Heterogeneity With and Without the Dirichlet Process, 2001.

[18]  A. Y. Lo, On a Class of Bayesian Nonparametric Estimates: I. Density Estimates, 1984.

[19]  Peter E. Rossi, et al., Bayesian analysis of stochastic volatility models with fat-tails and correlated errors, 2004.

[20]  Ramsés H. Mena, et al., Hierarchical Mixture Modeling With Normalized Inverse-Gaussian Priors, 2005.

[21]  Stephen G. Walker, Sampling the Dirichlet Mixture Model with Slices, Commun. Stat. Simul. Comput. 36:45–54, 2007.

[22]  Ramsés H. Mena, et al., Controlling the reinforcement in Bayesian non-parametric mixture models, 2007.

[23]  G. Roberts, et al., Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models, 2007, arXiv:0710.4228.

[24]  Yee Whye Teh, et al., Beam sampling for the infinite hidden Markov model, ICML '08, 2008.

[25]  D. Dunson, Kernel local partition processes for functional data, 2008.

[26]  O. Papaspiliopoulos, A note on posterior sampling from Dirichlet mixture models, 2008.

[27]  C. Yau, et al., Bayesian nonparametric hidden Markov models with application to the analysis of copy number variation in mammalian genomes, Journal of the Royal Statistical Society, Series B, 2011.