A new fast neural network training algorithm based on minimizing a modified backpropagation (MBP) criterion

In this work a new approach to the learning process of multilayer perceptron neural networks (NNs) is proposed. This approach minimizes a modified form of the criterion used in the standard backpropagation (SBP) algorithm, formed by the sum of the linear and the nonlinear quadratic errors of the output neuron. To determine the desired targets in the hidden layers, a back-propagation strategy analogous to that used in conventional learning algorithms is developed. This permits the learning procedure to be applied to all layers. Simulation results on the 4-bit parity checker and the circle-in-the-square problem indicate a significant reduction in the total number of iterations compared with the SBP algorithm.
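To make the modified criterion concrete, the following is a minimal sketch, not the paper's exact update rules: for each output neuron it combines the usual nonlinear quadratic error (d - f(net))^2 with a linear quadratic error (f^{-1}(d) - net)^2, where f is the sigmoid and f^{-1} its inverse. The weighting factor `lam`, the network sizes, the learning rate, and the plain gradient-descent loop (which back-propagates the combined output delta through the standard chain rule rather than the paper's hidden-layer target strategy) are all illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_inv(y):
    # Inverse of the sigmoid; targets are kept in (0, 1) so this is finite.
    return np.log(y / (1.0 - y))

rng = np.random.default_rng(0)

# 4-bit parity: 16 input patterns; targets scaled to 0.1 / 0.9 so that
# the linear-domain target f^{-1}(d) stays moderate (an assumption).
X = np.array([[(i >> b) & 1 for b in range(4)] for i in range(16)], float)
d = 0.8 * (X.sum(axis=1) % 2).reshape(-1, 1) + 0.1
d_lin = sigmoid_inv(d)  # linear-domain target f^{-1}(d)

# One hidden layer of 8 sigmoid units (illustrative size).
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lam, lr = 0.1, 0.5
errors = []

for epoch in range(5000):
    h_net = X @ W1 + b1; h = sigmoid(h_net)
    o_net = h @ W2 + b2; y = sigmoid(o_net)

    # Combined criterion: nonlinear quadratic error plus weighted
    # linear quadratic error at the output neuron.
    errors.append(float(np.mean((d - y) ** 2 + lam * (d_lin - o_net) ** 2)))

    # Gradient w.r.t. the output pre-activation:
    # nonlinear term contributes (y - d) * y * (1 - y),
    # linear term contributes lam * (o_net - d_lin).
    delta_o = (y - d) * y * (1 - y) + lam * (o_net - d_lin)
    delta_h = (delta_o @ W2.T) * h * (1 - h)

    W2 -= lr * h.T @ delta_o / len(X); b2 -= lr * delta_o.mean(axis=0)
    W1 -= lr * X.T @ delta_h / len(X); b1 -= lr * delta_h.mean(axis=0)

acc = float(((y > 0.5) == (d > 0.5)).mean())
print(f"final combined error: {errors[-1]:.4f}, accuracy: {acc:.2f}")
```

The linear error term keeps a useful gradient even when the sigmoid saturates (where y(1 - y) vanishes), which is one intuition for why such a combined criterion can cut the iteration count relative to the purely nonlinear SBP criterion.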