Combination of radial basis function neural networks with optimized learning vector quantization

Randomly initialized radial basis function (RBF) neural networks are compared with networks whose centers are obtained by vector quantization. Using quantized centers reduces the error rate of small networks by about 28%. To reach the same performance as a randomly initialized network, a trained network needs only half as many hidden neurons, which may be important for time-critical applications. The time needed to train and initialize the smaller network is comparable to the time needed to initialize the larger one.
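The approach can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the paper uses optimized learning vector quantization to place the centers, whereas the sketch below substitutes plain k-means as a common vector-quantization stand-in, uses an assumed fixed Gaussian width, and fits the output weights by least squares on a synthetic two-class problem.

```python
import numpy as np

def vq_centers(X, k, iters=20, seed=0):
    """Place RBF centers by vector quantization (plain k-means here;
    the paper uses an optimized LVQ variant instead)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each sample to its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned samples
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

def rbf_design(X, centers, width):
    """Gaussian hidden-layer activations for every sample/center pair."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Toy two-class data: two Gaussian blobs (synthetic, for illustration only)
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1.0, 0.3, (50, 2)),
               rng.normal(+1.0, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

centers = vq_centers(X, k=4)           # small network: few hidden neurons
H = rbf_design(X, centers, width=0.5)  # hidden-layer outputs
w, *_ = np.linalg.lstsq(H, y, rcond=None)  # linear output weights
pred = (H @ w > 0.5).astype(int)
acc = (pred == y).mean()
```

Because the centers already cover the regions where the data lies, a small number of hidden neurons suffices, which is the effect the abstract reports.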