Backpropagation Growing Networks: Towards Local Minima Elimination

The problem of local minima in backpropagation-based networks [Rumelhart 86] is addressed. Networks that should learn quickly can instead require a very large number of training epochs. This is usually caused by a combination of initial conditions (network size, training set size relative to the problem, learning parameters, etc.) that drives the network into local minima, broad valleys, or large plateaus of the error surface. Several procedures have been proposed to solve this problem: some modify the training set, while others exploit preprocessing of the problem to choose the network's initial position. We present a fully local method to avoid the problem: each neuron detects the local minimum in which it may be trapped. Once detection is achieved, several methods are proposed to pull the network out of the minimum. The first increases the network size by producing new neurons (Meiosis [Hanson 90b]); other compatible methods are presented as well.
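The growth step can be illustrated with a minimal sketch. The Python code below is an illustrative simplification, not Hanson's stochastic-weight Meiosis formulation, and the function and parameter names are hypothetical. It splits one hidden unit of a single-hidden-layer network into two children: each child inherits a slightly perturbed copy of the parent's incoming weights, and the parent's outgoing weights are halved and shared, so the network's function is approximately preserved while its capacity grows.

```python
import numpy as np

def split_hidden_unit(W_in, W_out, j, rng, noise=0.01):
    """Split hidden unit j into two children (Meiosis-style growth).

    W_in  : (n_hidden, n_inputs)  incoming weights (rows = hidden units)
    W_out : (n_outputs, n_hidden) outgoing weights (columns = hidden units)

    Each child receives a perturbed copy of the parent's incoming
    weights; the parent's outgoing weights are halved and shared, so
    the network computes approximately the same function afterwards.
    """
    # Child's incoming weights: perturbed copy of the parent's row.
    w_in_child = W_in[j] + noise * rng.standard_normal(W_in.shape[1])
    # Halved outgoing weights, shared between parent and child.
    w_out_half = W_out[:, j] / 2.0

    # Append the child as a new hidden unit and perturb the parent too.
    W_in_new = np.vstack([W_in, w_in_child[None, :]])
    W_in_new[j] += noise * rng.standard_normal(W_in.shape[1])

    W_out_new = np.hstack([W_out, w_out_half[:, None]])
    W_out_new[:, j] = w_out_half
    return W_in_new, W_out_new

# Usage: grow a 4-3-2 network into a 4-4-2 network by splitting unit 1.
rng = np.random.default_rng(0)
W_in = rng.standard_normal((3, 4))   # 3 hidden units, 4 inputs
W_out = rng.standard_normal((2, 3))  # 2 outputs
W_in2, W_out2 = split_hidden_unit(W_in, W_out, j=1, rng=rng)
```

Halving the outgoing weights is what keeps the split (nearly) function-preserving; the small input-side perturbation breaks the symmetry between the two children so that subsequent gradient steps can drive them apart and escape the stalled region.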