Accelerating the standard backpropagation method using a genetic approach

Abstract Backpropagation is a well-known method for training feedforward neural networks, and it usually requires a large number of iterations to train a network. In this paper it is shown that this number can be drastically reduced if, after each iteration step, the weights are changed by a simple mutation, as is done in genetic algorithms. In the examples described, the computing time was reduced on average to 5.5% of the pure backpropagation time, and the average number of iterations dropped from 120 000 (backpropagation) to 1 400.
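The idea stated in the abstract can be sketched as follows. This is a minimal illustration under assumptions not spelled out here (network size, task, learning rate, mutation strength, and the rule for accepting a mutation are all choices of this sketch, not the paper's exact algorithm): a small network is trained on XOR, and after each backpropagation step the weights receive a random Gaussian mutation that is kept only if it lowers the error.

```python
import numpy as np

# Hedged sketch: one hidden-layer network on XOR, trained with plain
# backpropagation plus a GA-style mutation after every iteration.
# All hyperparameters below are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4))   # input -> hidden weights
W2 = rng.normal(0.0, 1.0, (4, 1))   # hidden -> output weights

def forward(W1, W2):
    h = sigmoid(X @ W1)
    return h, sigmoid(h @ W2)

def error(W1, W2):
    _, out = forward(W1, W2)
    return float(np.mean((out - y) ** 2))

lr, sigma = 1.0, 0.05               # learning rate, mutation strength
e0 = error(W1, W2)                  # error before training

for _ in range(2000):
    # --- standard backpropagation step (mean-squared error) ---
    h, out = forward(W1, W2)
    d_out = (out - y) * out * (1 - out)     # output-layer delta
    d_h = (d_out @ W2.T) * h * (1 - h)      # hidden-layer delta
    W2 -= lr * (h.T @ d_out)
    W1 -= lr * (X.T @ d_h)
    # --- GA-style mutation: perturb weights, keep only if it helps ---
    m1 = rng.normal(0.0, sigma, W1.shape)
    m2 = rng.normal(0.0, sigma, W2.shape)
    if error(W1 + m1, W2 + m2) < error(W1, W2):
        W1 += m1
        W2 += m2

e_final = error(W1, W2)
```

The accept-if-better rule makes the mutation step a form of (1+1) hill climbing layered on top of gradient descent; other acceptance rules (e.g. unconditional small mutations) are equally compatible with the abstract's description.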