Fast α-weighted EM learning for neural networks of module mixtures

A class of extended logarithms is used to derive α-weighted EM (α-weighted expectation-maximization) algorithms. These extended EM algorithms (WEMs, or α-EMs) are expected to outperform the traditional (logarithmic) EM algorithm in convergence speed; the traditional approach is a special case of the new WEM. In this paper, general theoretical discussions are given first. Then, clear-cut evidence of faster convergence than the ordinary EM approach is given for the case of mixture-of-experts neural networks. This proceeds in three steps. First, specific algorithms are presented. Second, their convergence is verified theoretically. Third, experiments on mixture-of-experts learning demonstrate the superiority of the WEM. Besides the supervised case, unsupervised learning of a Gaussian mixture is also examined, and faster convergence of the WEM is again observed.
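For reference, the traditional logarithmic EM that the WEM generalizes can be sketched for the unsupervised Gaussian-mixture case mentioned above. The following is a minimal illustrative implementation of standard EM for a one-dimensional Gaussian mixture, not the paper's α-weighted algorithm; all function and variable names are the author's own choices for illustration.

```python
import numpy as np

def em_gmm(x, n_components=2, n_iter=100):
    """Standard (log-likelihood) EM for a 1-D Gaussian mixture.

    Illustrative baseline only: the abstract's WEM replaces the
    ordinary logarithm with an extended (alpha-) logarithm, which
    is not implemented here.
    """
    n = len(x)
    # Initialise mixing weights, means, and variances.
    pi = np.full(n_components, 1.0 / n_components)
    mu = np.quantile(x, (np.arange(n_components) + 1.0) / (n_components + 1))
    var = np.full(n_components, np.var(x))
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component per sample.
        diff = x[:, None] - mu[None, :]
        logp = -0.5 * (diff**2 / var + np.log(2 * np.pi * var)) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)  # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk
    return pi, mu, var
```

The per-iteration cost is the same shape of computation the WEM would perform; the α-weighting modifies how the E-step surrogate is formed, which is where the reported speed-up originates.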