Signal processing with radial basis function networks using expectation-maximization algorithm clustering

Several researchers have investigated the usefulness of radial basis function (RBF) neural networks for prediction of difficult (chaotic) time series [1, 8]. This paper demonstrates adaptive unsupervised/supervised learning procedures that improve on RBF network performance in terms of generalization and final mean-square error. After a description of the RBF network architecture, the learning of the network is discussed with comments on the advantages and disadvantages of various approaches. Unsupervised clustering of RBF centres has been used by researchers [6, 10] to overcome some of the disadvantages of supervised learning and to improve the performance of the RBF network while keeping the number of basis functions relatively low. A maximum likelihood solution for unsupervised clustering, the expectation-maximization (EM) algorithm, has recently been used [10] to estimate the parameters of the RBF hidden layer units. The input density is modeled as a mixture of component Gaussian distributions. The estimated parameters of the mixture density are then transplanted into the RBF network, after which supervised learning of the network takes place. An extension to clustering in the joint input-output space is also presented. Although papers have been written on input space clustering for RBF networks, this paper examines why the extended metric clustering should be a superior methodology. Two examples, a chaotic time series predictor and a cross-polarization canceller, illustrate the performance of the algorithms. The EM algorithm with the extended metric clustering method achieves superior performance on signal processing problems by creating a better hidden layer representation in areas of the input space where samples are more likely to be jointly located.
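The pipeline the abstract describes can be sketched in outline: fit a Gaussian mixture by EM on joint input-output vectors (the "extended metric"), transplant the mixture means and variances into the RBF hidden layer as centres and widths, then train the output layer by supervised learning. The following is a minimal sketch under simplifying assumptions not stated in the paper (spherical components, a least-squares output layer, and all function names are illustrative), not a reconstruction of the authors' exact algorithm:

```python
import numpy as np

def em_gmm(Z, k, iters=50, seed=0):
    """EM for a k-component spherical Gaussian mixture on the rows of Z."""
    rng = np.random.default_rng(seed)
    n, d = Z.shape
    mu = Z[rng.choice(n, k, replace=False)]     # initial means: random samples
    var = np.full(k, Z.var())                   # per-component variances
    pi = np.full(k, 1.0 / k)                    # mixing weights
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = p(component j | sample i)
        d2 = ((Z[:, None, :] - mu[None]) ** 2).sum(-1)
        logp = np.log(pi) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        logp -= logp.max(axis=1, keepdims=True)  # stabilize before exp
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate means, variances, and mixing weights
        nj = r.sum(axis=0) + 1e-12
        mu = (r.T @ Z) / nj[:, None]
        d2 = ((Z[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (r * d2).sum(axis=0) / (d * nj) + 1e-6
        pi = nj / n
    return mu, var, pi

def design(X, centres, widths):
    """Gaussian RBF design matrix: one basis function per centre."""
    d2 = ((X[:, None, :] - centres[None]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * widths ** 2))

def rbf_fit(X, y, k=10):
    """Extended-metric variant: cluster in joint (x, y) space, then
    transplant centres/widths and solve the output weights."""
    Z = np.hstack([X, y[:, None]])              # joint input-output vectors
    mu, var, _ = em_gmm(Z, k)
    centres = mu[:, :X.shape[1]]                # input-space part of each mean
    widths = np.sqrt(var)
    w, *_ = np.linalg.lstsq(design(X, centres, widths), y, rcond=None)
    return centres, widths, w
```

Clustering on the joint vectors concentrates hidden units where input-output pairs actually co-occur, which is the intuition the abstract gives for the improved hidden layer representation.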