Posteriors, conjugacy, and exponential families for completely random measures

We demonstrate how to calculate posteriors for general completely random measure (CRM)-based priors and likelihoods in Bayesian nonparametric models. We further show how to represent Bayesian nonparametric priors as a sequence of finite draws using a size-biasing approach, and how to represent full Bayesian nonparametric models via finite marginals. Motivated by conjugate priors based on exponential family representations of likelihoods, we introduce a notion of exponential families for CRMs, which we call exponential CRMs. This construction allows us to specify automatic Bayesian nonparametric conjugate priors for exponential CRM likelihoods. We demonstrate that our exponential CRMs allow particularly straightforward recipes for size-biased and marginal representations of Bayesian nonparametric models. Along the way, we prove that the gamma process is a conjugate prior for the Poisson likelihood process and that the beta prime process is a conjugate prior for a process we call the odds Bernoulli process. We deliver a size-biased representation of the gamma process and a marginal representation of the gamma process coupled with a Poisson likelihood process.
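
The gamma-process/Poisson conjugacy claimed above has a familiar finite-dimensional core: at each atom, a gamma-distributed weight observed through Poisson counts has a gamma posterior. The sketch below is a minimal numerical illustration under assumed names (gamma_mass, lam, truncation level K); it uses a generic independent-gamma finite approximation to the gamma process rather than the size-biased representation developed in the paper, and it applies the standard per-atom Gamma(a, b) to Gamma(a + sum of counts, b + N) update induced by Poisson observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite approximation (not the paper's size-biased construction):
# K atoms with weights theta_k ~ Gamma(gamma_mass / K, rate=lam) stand in for a
# gamma process CRM with mass parameter gamma_mass and rate parameter lam.
K = 1000            # truncation level (assumed)
gamma_mass = 3.0    # gamma process mass parameter (assumed)
lam = 1.0           # gamma process rate parameter (assumed)

theta = rng.gamma(shape=gamma_mass / K, scale=1.0 / lam, size=K)

# Poisson likelihood process at the atom level: each of N observations draws an
# independent count at every atom, x_{n,k} ~ Poisson(theta_k).
N = 50
x = rng.poisson(lam=theta, size=(N, K))

# Standard per-atom gamma-Poisson conjugate update:
#   theta_k | x_{1:N,k} ~ Gamma(gamma_mass / K + sum_n x_{n,k}, lam + N)
post_shape = gamma_mass / K + x.sum(axis=0)
post_rate = lam + N

# Posterior means of the atom weights, compared against the simulated truth.
post_mean = post_shape / post_rate
print("mean abs error of per-atom posterior means:", np.abs(post_mean - theta).mean())
```

As N grows, the per-atom posterior means concentrate around the simulated weights; this is the finite-dimensional analogue of the process-level statement that a gamma process prior composed with a Poisson likelihood process yields a posterior with gamma-distributed fixed atoms.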
