Asymptotic Complexity of an RBF NN for Correlated Data Representation

We address the problem of architecture selection for an RBF network designed for classification. Given a training set, the RBF network produces an estimate of the Probability Density Function (PDF) as a mixture of l uncorrelated Gaussian functions, where l is the number of hidden neurons. Using uncorrelated Gaussians avoids the heavy computational burden of estimating full covariance matrices. However, the simplicity of these building blocks comes at a cost: a relatively large number of units is needed to approximate the density of correlated data. We define two scalar parameters that describe the complexity of the data to be modelled and study the relationship between the complexity of the data and the complexity of the best approximating network.
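To make the modelling assumption concrete, the sketch below evaluates a mixture of uncorrelated (diagonal-covariance) Gaussians, the density form the abstract describes. The function names, the unit parameters, and the two-unit example are illustrative choices, not part of the paper.

```python
import numpy as np

def diag_gaussian_pdf(x, mu, sigma2):
    """PDF of a Gaussian with diagonal covariance: variances sigma2, no cross terms."""
    norm = np.prod(2.0 * np.pi * sigma2) ** -0.5
    return norm * np.exp(-0.5 * np.sum((x - mu) ** 2 / sigma2))

def rbf_mixture_pdf(x, mus, sigma2s, weights):
    """Mixture of l uncorrelated Gaussians: sum_j w_j * N(x; mu_j, diag(sigma2_j))."""
    return sum(w * diag_gaussian_pdf(x, mu, s2)
               for w, mu, s2 in zip(weights, mus, sigma2s))

# Illustrative example: l = 2 hidden units in 2-D.
mus = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
sigma2s = [np.array([1.0, 1.0]), np.array([0.5, 2.0])]
weights = [0.6, 0.4]
p = rbf_mixture_pdf(np.array([0.0, 0.0]), mus, sigma2s, weights)
```

Because each unit's covariance is diagonal, modelling strongly correlated data (density mass concentrated along an oblique direction) requires tiling that direction with many axis-aligned Gaussians, which is why l must grow with the correlation of the data.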