Fast layer-by-layer training of a feedforward neural network classifier with a genetic algorithm

Because the error backpropagation algorithm used to train feedforward neural networks is based on the steepest descent technique, its convergence is slow and it is prone to becoming trapped in local minima. We propose a new learning method for pattern classification in which a genetic algorithm optimizes the interconnection weights layer by layer, adding hidden layers one at a time. Computer simulations show that this layer-by-layer learning method converges considerably faster, at the cost of a larger network.
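To make the idea concrete, the following is a minimal sketch of one way such a layer-by-layer, GA-based scheme could look; it is not the paper's exact algorithm. All names and parameters (hidden_size, pop_size, mutation rate, the tournament-selection GA, and the choice to freeze earlier layers while a temporary output layer is co-evolved with each new hidden layer) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(X, layers):
    """Propagate X through a list of (W, b) layers with sigmoid activations."""
    h = X
    for W, b in layers:
        h = sigmoid(h @ W + b)
    return h

def accuracy(outputs, y):
    return np.mean(np.argmax(outputs, axis=1) == y)

def ga_optimize(fitness_fn, n_params, pop_size=60, n_gen=200,
                mut_rate=0.1, mut_scale=0.3):
    """Plain real-coded GA: tournament selection, uniform crossover, Gaussian mutation."""
    pop = rng.normal(scale=0.5, size=(pop_size, n_params))
    fit = np.array([fitness_fn(p) for p in pop])
    for _ in range(n_gen):
        children = []
        for _ in range(pop_size):
            # tournament selection of two parents
            a, b = rng.integers(pop_size, size=2)
            p1 = pop[a] if fit[a] > fit[b] else pop[b]
            a, b = rng.integers(pop_size, size=2)
            p2 = pop[a] if fit[a] > fit[b] else pop[b]
            mask = rng.random(n_params) < 0.5             # uniform crossover
            child = np.where(mask, p1, p2)
            mut = rng.random(n_params) < mut_rate         # Gaussian mutation
            child = child + mut * rng.normal(scale=mut_scale, size=n_params)
            children.append(child)
        pop = np.array(children)
        fit = np.array([fitness_fn(p) for p in pop])
    return pop[np.argmax(fit)], fit.max()

def train_layer_by_layer(X, y, n_classes, hidden_size=8, max_layers=3, target_acc=0.98):
    """Grow the network one hidden layer at a time; each new layer's weights
    (plus a fresh output layer) are found by the GA while earlier layers stay frozen."""
    frozen = []                        # (W, b) pairs of already-trained hidden layers
    for _ in range(max_layers):
        feats = forward(X, frozen)     # features from the frozen part of the network
        d_in = feats.shape[1]
        # genome = new hidden layer (W_h, b_h) + temporary output layer (W_o, b_o)
        sizes = [(d_in, hidden_size), (hidden_size,), (hidden_size, n_classes), (n_classes,)]
        n_params = int(sum(np.prod(s) for s in sizes))

        def unpack(genome):
            parts, i = [], 0
            for s in sizes:
                n = int(np.prod(s))
                parts.append(genome[i:i + n].reshape(s))
                i += n
            return (parts[0], parts[1]), (parts[2], parts[3])

        def fitness(genome):
            hid, out = unpack(genome)
            return accuracy(forward(feats, [hid, out]), y)

        best, best_acc = ga_optimize(fitness, n_params)
        hid, out = unpack(best)
        frozen.append(hid)             # keep the hidden layer; the output layer is rebuilt next stage
        print(f"layers={len(frozen)}  training accuracy={best_acc:.3f}")
        if best_acc >= target_acc:
            break
    return frozen + [out]

# toy two-class (XOR-like) problem just to exercise the sketch
X = rng.random((200, 2))
y = ((X[:, 0] > 0.5) ^ (X[:, 1] > 0.5)).astype(int)
net = train_layer_by_layer(X, y, n_classes=2)
print("final training accuracy:", accuracy(forward(X, net), y))

Because each stage searches only the weights of the newly added layer, the GA works in a much smaller search space than whole-network evolution, which is the intuition behind the faster convergence; the price is that extra hidden layers may be added where a jointly trained network of fixed size would have sufficed.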