Unsupervised learning of finite mixtures using entropy regularization and its application to image segmentation

When fitting finite mixtures to multivariate data, it is crucial to select the appropriate number of components. Within the framework of regularization theory, we address this "unsupervised" learning problem by regularizing the likelihood with the full entropy of the posterior probabilities of the finite mixture. Two deterministic annealing implementations are then proposed for this entropy regularized likelihood (ERL) learning. Through an asymptotic analysis of deterministic annealing ERL (DAERL) learning, we find that globally minimizing the ERL function in an annealed manner leads to automatic model selection for finite mixtures and also makes the DAERL algorithms less sensitive to initialization than the standard EM algorithm. Simulation experiments confirm these theoretical findings, and, when evaluated on unsupervised image segmentation, the proposed algorithms outperform other state-of-the-art methods.
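For concreteness, one common way to write such an entropy regularized likelihood objective is sketched below; the abstract does not give the exact functional form, so the regularization coefficient $\gamma$ and the sign conventions used here are assumptions. For a $K$-component mixture $p(x \mid \Theta) = \sum_{k=1}^{K}\pi_k\,p(x \mid \theta_k)$ fitted to samples $x_1,\dots,x_N$, an ERL function to be minimized could take the form

\[
E(\Theta) \;=\; -\sum_{t=1}^{N}\log\sum_{k=1}^{K}\pi_k\,p(x_t\mid\theta_k)
\;-\;\gamma\sum_{t=1}^{N}\sum_{k=1}^{K}P(k\mid x_t,\Theta)\log P(k\mid x_t,\Theta),
\]

where $P(k\mid x_t,\Theta)=\pi_k\,p(x_t\mid\theta_k)\big/\sum_{j=1}^{K}\pi_j\,p(x_t\mid\theta_j)$ is the posterior probability of component $k$ given $x_t$. The second term equals $\gamma$ times the Shannon entropy of the posteriors; in a deterministic annealing scheme, $\gamma$ (or an associated temperature) is varied gradually so that the posteriors are driven toward crisp assignments, redundant components receive vanishing responsibility, and automatic model selection emerges as described above.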
