In this letter, we consider a variational approximate Bayesian inference framework, latent-space variational Bayes (LSVB), in the general context of conjugate-exponential-family models with latent variables. In the LSVB approach, we integrate out the model parameters exactly and then perform variational inference over the latent variables alone. It can be shown that LSVB achieves better estimates of both the model evidence and the distribution over the latent variables than the popular variational Bayesian expectation-maximization (VBEM) algorithm. In practice, however, the distribution over the latent variables in LSVB must itself be approximated. As an approximate implementation of LSVB, we propose a second-order LSVB (SoLSVB) method. In particular, VBEM can be derived as a special case of a first-order approximation within LSVB (Sung et al., 2008). SoLSVB captures higher-order statistics neglected in VBEM and can therefore achieve a better approximation. Examples with Gaussian mixture models are used to compare SoLSVB with VBEM and demonstrate the improvement.
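To make the comparison concrete, the following is a minimal sketch, in our own notation rather than the paper's (here $X$ denotes the observed data, $Z$ the latent variables, $\theta$ the model parameters, and $q$ a variational distribution), of the evidence lower bounds that LSVB and VBEM each maximize:

\[
p(X, Z) = \int p(X, Z \mid \theta)\, p(\theta)\, d\theta
\quad \text{(available in closed form for conjugate-exponential models),}
\]
\[
\mathcal{F}_{\mathrm{LSVB}}\big[q(Z)\big]
  = \mathbb{E}_{q(Z)}\big[\ln p(X, Z)\big] + \mathcal{H}\big[q(Z)\big],
\]
\[
\mathcal{F}_{\mathrm{VBEM}}\big[q(Z), q(\theta)\big]
  = \mathbb{E}_{q(Z)\,q(\theta)}\!\left[\ln \frac{p(X, Z \mid \theta)\, p(\theta)}{q(Z)\, q(\theta)}\right],
\]
\[
\ln p(X) \;\ge\; \mathcal{F}_{\mathrm{LSVB}}\big[q(Z)\big]
         \;\ge\; \mathcal{F}_{\mathrm{VBEM}}\big[q(Z), q(\theta)\big]
\quad \text{for all } q(Z) \text{ and } q(\theta).
\]

The second inequality follows from Jensen's inequality applied to the expectation over $q(\theta)$, which is why optimizing the LSVB bound over $q(Z)$ alone can never do worse than VBEM's optimization over the factorized family $q(Z)\,q(\theta)$. The practical difficulty, as noted above, is that the optimal $q(Z)$ is generally intractable and must be approximated: to first order in VBEM, and to second order in SoLSVB.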
[1] Y. W. Teh et al., "A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation," Advances in Neural Information Processing Systems, 2007.
[2] J. Sung et al., "Latent-Space Variational Bayes," IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008.
[3] M. J. Beal and Z. Ghahramani, "The Variational Bayesian EM Algorithm for Incomplete Data: With Application to Scoring Graphical Model Structures," Bayesian Statistics 7, 2003.
[4] J. O. Berger et al., "Ockham's Razor and Bayesian Analysis," 1992.
[5] H. Attias, "Inferring Parameters and Structure of Latent Variable Models by Variational Bayes," UAI, 1999.
[6] L. D. Brown, Fundamentals of Statistical Exponential Families, 1987.
[7] C. M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.