An automated design system for finding the minimal configuration of a feed-forward neural network

In this paper, we present a method for finding the minimal configuration of a feed-forward artificial neural network (ANN) for solving a given application problem. We assume that the cascade-correlation (CAS) training algorithm is used to train the weights of the ANNs concerned. Given a time constraint long enough to train tens of ANNs to completion, we divide the available time into quanta and present a method for dynamically scheduling, in each quantum, the ANN to be trained from a pool of partially trained ANNs. Our goal is to find an ANN configuration with fewer hidden units than the alternative of applying the CAS algorithm repeatedly, training each ANN to completion before exploring new ones. Our system is a population-based generate-and-test method that maintains a population of candidate ANNs and selectively trains those predicted to require smaller configurations. Since it is difficult to predict the exact number of hidden units required when the CAS algorithm terminates, our system instead compares two partially trained ANNs and predicts which one will converge with fewer hidden units. Our prediction mechanism is based on a comparator neural network (CANN) that takes as input the TSSE-versus-time behavior of the training performed so far on two ANNs and predicts which one will require fewer hidden units when convergence is reached. We show that our CANN predicts correctly most of the time, and we present experimental results on the better configurations found within a given time limit for a classification problem and the two-spiral problem.
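The scheduling loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: candidate ANNs are stubbed by objects that record a TSSE-versus-time history, one training quantum simply extends that history, and the CANN comparator is replaced by a placeholder heuristic (steeper recent TSSE decrease wins). All names (`Candidate`, `compare`, `run_schedule`) and the decay model are assumptions for illustration only.

```python
import random

class Candidate:
    """Stub for a partially trained ANN under cascade-correlation.
    Real training is replaced by a geometric TSSE decay per quantum."""

    def __init__(self, seed):
        rng = random.Random(seed)
        self.decay = rng.uniform(0.80, 0.99)  # per-quantum TSSE decay (stub)
        self.tsse_history = [1.0]             # TSSE recorded after each quantum

    def train_quantum(self):
        # Stand-in for one time quantum of CAS training.
        self.tsse_history.append(self.tsse_history[-1] * self.decay)

def compare(a, b):
    """Placeholder for the CANN comparator: predict which candidate will
    converge with fewer hidden units, judged here by the larger overall
    relative drop in TSSE."""
    drop = lambda c: c.tsse_history[-1] / c.tsse_history[0]
    return a if drop(a) < drop(b) else b

def run_schedule(pool, total_quanta):
    """Each quantum, select the most promising candidate by pairwise
    comparison and grant it one quantum of training."""
    for _ in range(total_quanta):
        best = pool[0]
        for other in pool[1:]:
            best = compare(best, other)
        best.train_quantum()
    # Return the candidate with the lowest TSSE at the deadline.
    return min(pool, key=lambda c: c.tsse_history[-1])

pool = [Candidate(seed) for seed in range(5)]
winner = run_schedule(pool, total_quanta=50)
```

In the paper, the `compare` step is itself a trained neural network, and "promise" means converging with fewer hidden units rather than merely a lower current TSSE; the sketch only shows how such a comparator slots into quantum-by-quantum scheduling.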