Hierarchical Gaussian Mixture Model

Gaussian mixture models (GMMs) are a convenient and essential tool for estimating probability density functions. Although GMMs are used in many research domains, from image processing to machine learning, this statistical mixture modeling is usually complex and often needs to be simplified. In this paper, we present a GMM simplification method based on a hierarchical clustering algorithm. Our method allows one, first, to quickly compute a compact version of the initial GMM and, second, to automatically learn the optimal number of components of the simplified GMM. Using the framework of Bregman divergences, this simplification algorithm, although presented here for GMMs, is suitable for any mixture of exponential families.
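To make the general idea concrete, the sketch below is a minimal, generic agglomerative simplification of a GMM: it repeatedly merges the pair of components closest under symmetrized Kullback-Leibler divergence (the Bregman divergence associated with Gaussians in natural coordinates), using a moment-preserving merge, until a target size k is reached. This is an illustrative sketch, not the authors' exact algorithm: the paper works with Bregman centroids and learns k automatically, whereas here k is fixed by the caller. The function names (`simplify_gmm`, `merge_components`, `kl_gauss`) are hypothetical, and NumPy is assumed.

```python
import numpy as np

def kl_gauss(m0, S0, m1, S1):
    """Closed-form KL( N(m0,S0) || N(m1,S1) ) between two Gaussians."""
    d = m0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def merge_components(w1, m1, S1, w2, m2, S2):
    """Moment-preserving merge of two weighted Gaussian components."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = m1 - m, m2 - m
    S = (w1 * (S1 + np.outer(d1, d1)) + w2 * (S2 + np.outer(d2, d2))) / w
    return w, m, S

def simplify_gmm(weights, means, covs, k):
    """Greedily collapse a GMM to k components by merging the
    closest pair (symmetrized KL) at each step."""
    comps = list(zip(weights, means, covs))
    while len(comps) > k:
        best, best_pair = np.inf, None
        for i in range(len(comps)):
            for j in range(i + 1, len(comps)):
                _, mi, Si = comps[i]
                _, mj, Sj = comps[j]
                dist = kl_gauss(mi, Si, mj, Sj) + kl_gauss(mj, Sj, mi, Si)
                if dist < best:
                    best, best_pair = dist, (i, j)
        i, j = best_pair
        merged = merge_components(*comps[i], *comps[j])
        comps = [c for t, c in enumerate(comps) if t not in (i, j)] + [merged]
    w, m, S = zip(*comps)
    return np.array(w), np.array(m), np.array(S)

# Example: collapse a 5-component 2-D mixture to 2 components.
rng = np.random.default_rng(0)
means = rng.normal(size=(5, 2))
covs = np.array([np.eye(2)] * 5)
weights = np.full(5, 0.2)
w, m, S = simplify_gmm(weights, means, covs, k=2)
```

The moment-preserving merge keeps the mean and covariance of the replaced pair exact, so each greedy step changes the overall density as little as the chosen divergence allows; the exhaustive pairwise search is O(n^2) per step, which is acceptable for the moderate component counts typical of GMM simplification.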
