Lower bounds of stochastic complexities in variational Bayes learning of Gaussian mixture models

Bayesian learning is widely used and has proved effective in many data modelling problems. However, the computations it involves are costly and generally cannot be performed exactly. The variational Bayes approach, proposed as an approximation of Bayesian learning, provides computational tractability and good generalization performance in many applications. Despite these advantages, the properties and capabilities of variational Bayes learning itself have not yet been clarified; in particular, it is still unknown how good an approximation the variational Bayes approach can achieve. In this paper, we discuss variational Bayes learning of Gaussian mixture models and derive lower bounds on the stochastic complexities. The stochastic complexity is important not only for addressing the model selection problem but also for assessing the accuracy of the variational Bayes approach as an approximation of true Bayesian learning.
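For orientation, a sketch of the quantities involved, in standard notation that is ours and may differ from the paper's own. The Bayesian stochastic complexity is the negative log marginal likelihood, and the variational Bayes objective (the variational free energy) upper-bounds it:

\[
F(X^n) = -\log \int \prod_{i=1}^{n} p(x_i \mid \theta)\, \varphi(\theta)\, d\theta,
\]
\[
\overline{F}(X^n) = \min_{q}\; \mathbb{E}_{q(Z,\theta)}\!\left[ \log \frac{q(Z,\theta)}{p(X^n, Z, \theta)} \right] \;\ge\; F(X^n),
\]

where \(X^n = (x_1, \dots, x_n)\) is the data, \(Z\) denotes the latent mixture assignments, \(\varphi(\theta)\) is the prior, and \(q\) ranges over factorized distributions \(q(Z,\theta) = q(Z)\,q(\theta)\). The gap \(\overline{F}(X^n) - F(X^n)\) equals the minimized Kullback-Leibler divergence \(\mathrm{KL}\big(q(Z,\theta)\,\|\,p(Z,\theta \mid X^n)\big)\), which is why bounds on these complexities speak directly to the approximation accuracy discussed above.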