Classification using hierarchical mixtures of experts

There has recently been widespread interest in the statistics and neural networks communities in the use of multiple models for classification and regression. The hierarchical mixture of experts (HME) has been successful in a number of regression problems, yielding significantly faster training through the use of the expectation maximisation algorithm. In this paper we extend the HME to classification and report results on three common classification benchmarks: exclusive-OR, N-input parity, and two spirals.
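To illustrate the architecture the abstract refers to, the sketch below shows a single-level mixture of experts (the one-layer special case of an HME) classifying exclusive-OR. The gating network softmax-weights two logistic experts; here the parameters are hand-set for clarity rather than learned by the EM algorithm the paper uses, and all names (`moe_predict`, `gate_w`, `expert_w`, the constant `K`) are illustrative assumptions, not from the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def moe_predict(x, gate_w, expert_w):
    """Mixture-of-experts output: gate-weighted sum of expert probabilities."""
    xb = np.append(x, 1.0)       # augment input with a bias term
    g = softmax(gate_w @ xb)     # gating probabilities over the experts
    p = sigmoid(expert_w @ xb)   # each expert's probability of class 1
    return g @ p                 # mixture of the expert predictions

# Hand-set parameters that solve XOR (illustrative, not EM-trained):
K = 10.0
gate_w = np.array([[-K,  K, 0.0],     # expert 0 favoured where x2 > x1
                   [ K, -K, 0.0]])    # expert 1 favoured where x1 > x2
expert_w = np.array([[-K,  K, -K/2],  # expert 0 fires iff x2 - x1 > 0.5
                     [ K, -K, -K/2]]) # expert 1 fires iff x1 - x2 > 0.5

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
preds = [int(moe_predict(np.array(x, float), gate_w, expert_w) > 0.5)
         for x in X]
print(preds)  # [0, 1, 1, 0]
```

A full HME nests such gated mixtures recursively, with each non-terminal node's gate splitting the input space among its children; EM then alternates between computing posterior responsibilities for each expert (E-step) and refitting the gate and expert parameters under those responsibilities (M-step).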