Pruning neural networks by minimization of the estimated variance

This paper presents a series of results on a method of pruning neural networks. An approximation V to the estimated variance of the errors is constructed, containing a supplementary parameter a; the estimated variance itself is the limit of V as a tends to zero. The network weights are fitted using a minimization algorithm with V as the objective function, and a is reduced successively in the course of fitting. Results are presented for synthetic functions and the well-known airline passenger data. We find, for example, that the network can discover, in the course of being pruned, evidence of redundancy in the input variables.
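Since the abstract does not give the explicit form of V, the following is a minimal sketch of the general scheme it describes: minimize a surrogate objective V(w, a) over the weights and shrink a toward zero in the course of fitting. The particular surrogate used here (the residual sum of squares divided by N minus a smooth count of effectively non-zero weights, w^2/(w^2 + a)) is an assumption chosen for illustration, not the paper's definition; the network, synthetic data, and optimizer (a one-hidden-layer network fitted with scipy.optimize.minimize) are likewise hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical synthetic data: y depends only on x1; x2 is a redundant input.
N = 200
X = rng.uniform(-1.0, 1.0, size=(N, 2))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(N)

H = 5                         # hidden units
n_w = 2 * H + H + H + 1       # input weights, hidden biases, output weights, output bias

def unpack(w):
    i = 0
    W1 = w[i:i + 2 * H].reshape(2, H); i += 2 * H
    b1 = w[i:i + H]; i += H
    W2 = w[i:i + H]; i += H
    b2 = w[i]
    return W1, b1, W2, b2

def predict(w, X):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def V(w, a):
    """Surrogate objective: SSE / (N - p_eff), where p_eff is a smooth count of
    weights that are effectively non-zero.  As a -> 0 the count tends to the
    number of non-zero weights, recovering the usual estimated variance of the
    errors.  This specific form is an assumption made for illustration."""
    sse = np.sum((y - predict(w, X)) ** 2)
    p_eff = np.sum(w ** 2 / (w ** 2 + a))
    return sse / (N - p_eff)

# Continuation loop: fit the weights with V as objective, reducing a each round.
w = 0.1 * rng.standard_normal(n_w)
for a in [1.0, 0.1, 0.01, 0.001]:
    res = minimize(V, w, args=(a,), method="BFGS")
    w = res.x
    print(f"a={a:g}  V={res.fun:.4f}  near-zero weights: {np.sum(np.abs(w) < 1e-2)}")
```

Under this reading, weights attached to the redundant input can drift toward zero as a is reduced, which is one way the pruning process could reveal redundancy in the variables; whether this matches the paper's mechanism in detail is not established by the abstract.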