The Evolution of a Feedforward Neural Network Trained under Backpropagation

This paper presents a theoretical and empirical analysis of the evolution of a feedforward neural network (FFNN) trained using backpropagation (BP). The results of two sets of experiments are presented, illustrating the nature of BP's search through weight space as the network learns to classify the training data. The search is shown to be driven by the initial values of the weights in the output layer of neurons.
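
To make the kind of experiment described above concrete, the following is a minimal sketch of a single-hidden-layer FFNN trained with plain backpropagation, in which two runs share the same hidden-layer initialisation and differ only in the initial output-layer weights. The toy XOR data, network size, learning rate, epoch count, and the use of NumPy are illustrative assumptions, not details taken from the paper; the sketch only shows the general setup of varying the output-layer initialisation while holding everything else fixed.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def add_bias(A):
    # Append a constant bias column of ones.
    return np.hstack([A, np.ones((A.shape[0], 1))])

def train_ffnn(X, y, W1, W2, lr=0.5, epochs=5000):
    """Train a single-hidden-layer FFNN with batch backpropagation
    (squared-error loss). Records the output-layer weight trajectory."""
    Xb = add_bias(X)
    trajectory = []
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(Xb @ W1)        # hidden activations
        hb = add_bias(h)
        out = sigmoid(hb @ W2)      # network output
        # Backward pass.
        delta_out = (out - y) * out * (1 - out)
        # Drop the bias row of W2 when backpropagating to the hidden layer.
        delta_hid = (delta_out @ W2[:-1].T) * h * (1 - h)
        W2 -= lr * hb.T @ delta_out
        W1 -= lr * Xb.T @ delta_hid
        trajectory.append(W2.copy())
    return W1, W2, np.array(trajectory)

# Hypothetical toy classification task (XOR), standing in for the training data.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.uniform(-1, 1, size=(3, 3))   # shared hidden-layer initialisation (with bias input)

# Two runs differing only in the initial output-layer weights.
W2_a = rng.uniform(-1, 1, size=(4, 1))
W2_b = rng.uniform(-1, 1, size=(4, 1))
_, _, traj_a = train_ffnn(X, y, W1.copy(), W2_a.copy())
_, _, traj_b = train_ffnn(X, y, W1.copy(), W2_b.copy())

# Compare where the two weight-space trajectories end up.
print("final output-layer weights (run A):", traj_a[-1].ravel())
print("final output-layer weights (run B):", traj_b[-1].ravel())
```

Comparing the recorded trajectories from the two runs is one simple way to visualise how the path BP takes through weight space depends on the output-layer starting point, under the stated assumptions.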