Maximum Gaussian Mixture Model for Classification

There are a variety of models and algorithms that solve classification problems. Among these, the Maximum Gaussian Mixture Model (MGMM) is a model we proposed earlier that describes data using the maximum value of Gaussians. The Expectation Maximization (EM) algorithm can be used to fit this model. In this paper, we propose a multiEM approach to solve MGMM and to train MGMM-based classifiers. This approach combines multiple MGMMs, each solved by EM, into a single classifier. Under 10-fold cross validation, classifiers trained with this approach showed good performance on both artificial and real-life datasets.
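To make the multiEM idea concrete, the following is a minimal NumPy sketch, not the paper's implementation: it fits one diagonal-covariance Gaussian mixture per class with a plain EM loop, then scores a point against each class by the *maximum* weighted component log-density (the MGMM-style "max of Gaussians" rather than the usual sum) and predicts the highest-scoring class. All function names and the diagonal-covariance simplification are assumptions for illustration.

```python
import numpy as np

def em_gmm(X, K, iters=50, seed=0):
    """Fit a diagonal-covariance Gaussian mixture by EM (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    means = X[rng.choice(n, K, replace=False)]        # init means from data
    vars_ = np.full((K, d), X.var(axis=0))            # init per-dim variances
    weights = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: responsibilities from weighted log densities
        log_p = (-0.5 * (((X[:, None, :] - means) ** 2 / vars_).sum(-1)
                         + np.log(2 * np.pi * vars_).sum(-1))
                 + np.log(weights))
        log_p -= log_p.max(axis=1, keepdims=True)     # numerical stability
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances
        Nk = resp.sum(axis=0)
        weights = Nk / n
        means = resp.T @ X / Nk[:, None]
        vars_ = resp.T @ (X ** 2) / Nk[:, None] - means ** 2 + 1e-6
    return weights, means, vars_

def max_log_density(X, weights, means, vars_):
    """MGMM-style score: maximum (not sum) of weighted component log densities."""
    log_p = (-0.5 * (((X[:, None, :] - means) ** 2 / vars_).sum(-1)
                     + np.log(2 * np.pi * vars_).sum(-1))
             + np.log(weights))
    return log_p.max(axis=1)

def multiem_predict(X, models):
    """Combine one fitted mixture per class; predict the best-scoring class."""
    scores = np.column_stack([max_log_density(X, *m) for m in models])
    return scores.argmax(axis=1)
```

Usage follows the multiEM recipe in the abstract: run `em_gmm` once per class on that class's training points, collect the resulting models in class order, and call `multiem_predict` on test points. The 1e-6 variance floor is a common safeguard against degenerate components, not something specified in the text.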