Fast aggregation of Student mixture models

Mixtures of Student t-distributions have been shown to perform clustering with valuable robustness to outliers compared to their Gaussian mixture counterparts. Concurrently, distributed clustering has motivated much interest in methods that build a consensus partition from multiple input partitions. This paper addresses the latter need by aggregating mixtures of Student distributions: each Student component is first approximated by a finite Gaussian mixture, and an approximate KL divergence between the resulting Gaussian mixtures is then minimized iteratively.
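The following is a minimal sketch of the two building blocks the abstract names, not the paper's actual algorithm: expanding a Student component into a finite Gaussian mixture via its scale-mixture representation, and evaluating an approximate KL divergence between Gaussian mixtures in the spirit of Hershey & Olsen (ICASSP 2007). It assumes univariate components for brevity, and the function names (student_to_gmm, gmm_kl_variational) are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import gamma

def student_to_gmm(mu, sigma2, nu, K=8):
    # A Student t is a continuous scale mixture of Gaussians:
    #   t_nu(x; mu, sigma2) = int N(x; mu, sigma2 / w) Gamma(w; nu/2, nu/2) dw.
    # Discretize the Gamma mixing law at the midpoints of K equal-probability
    # bins to obtain a K-component Gaussian mixture with equal weights.
    probs = (np.arange(K) + 0.5) / K
    w = gamma.ppf(probs, a=nu / 2.0, scale=2.0 / nu)  # rate nu/2 -> scale 2/nu
    weights = np.full(K, 1.0 / K)
    means = np.full(K, mu)
    variances = sigma2 / w
    return weights, means, variances

def kl_gauss(m0, v0, m1, v1):
    # Closed-form KL(N(m0, v0) || N(m1, v1)) for univariate Gaussians.
    return 0.5 * (np.log(v1 / v0) + (v0 + (m0 - m1) ** 2) / v1 - 1.0)

def gmm_kl_variational(f, g):
    # Variational approximation of KL(f || g) between two Gaussian mixtures
    # (Hershey & Olsen, 2007); f and g are (weights, means, variances) triples:
    #   D(f||g) ~= sum_a w_a log( sum_a' w_a' e^{-KL(f_a||f_a')}
    #                             / sum_b v_b e^{-KL(f_a||g_b)} )
    wf, mf, vf = f
    wg, mg, vg = g
    total = 0.0
    for a in range(len(wf)):
        num = sum(wf[b] * np.exp(-kl_gauss(mf[a], vf[a], mf[b], vf[b]))
                  for b in range(len(wf)))
        den = sum(wg[b] * np.exp(-kl_gauss(mf[a], vf[a], mg[b], vg[b]))
                  for b in range(len(wg)))
        total += wf[a] * np.log(num / den)
    return total

# Two Student components, each expanded into a Gaussian mixture, then compared.
f = student_to_gmm(mu=0.0, sigma2=1.0, nu=3.0)
g = student_to_gmm(mu=1.5, sigma2=1.0, nu=3.0)
print(gmm_kl_variational(f, g))
```

In the aggregation setting the abstract describes, such a divergence would be evaluated between the pooled input mixtures and the aggregated model, and minimized iteratively with respect to the aggregated model's parameters; that optimization loop is not shown here.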
