A New Fast Learning Algorithm for Multi-Layer Feedforward Neural Networks

The strongly nonlinear relation between a training sample's impact on the error and the error's derivatives is the fundamental reason for the low learning efficiency of multi-layer feedforward neural networks. Effectively reducing the degree of this nonlinearity and its impact on network learning is therefore critical to improving training efficiency. Based on this idea, this paper proposes a new approach to accelerating learning, comprising a linearization technique for the nonlinear relation, a convergence technique based on local equalization of the training samples' errors, and a rotation adjustment of the weights. On this basis, a new fast learning algorithm for multi-layer feedforward neural networks is presented. Experimental results show that, compared with conventional algorithms, the new algorithm shortens training time by hundreds of times and remarkably improves the generalization of the neural networks.
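The abstract does not detail the proposed algorithm, but the "conventional algorithms" it is compared against are standard gradient-descent backpropagation. The following minimal sketch shows that baseline on the XOR task; the architecture, learning rate, task, and all variable names are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Conventional batch gradient-descent backpropagation for a small
# multi-layer feedforward network (the baseline the paper compares
# against). All hyperparameters here are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR training samples (a classic nonlinearly separable toy problem)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2-4-1 network with small random initial weights
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def forward(X):
    h = sigmoid(X @ W1 + b1)      # hidden-layer activations
    out = sigmoid(h @ W2 + b2)    # network output
    return h, out

def mse(out):
    return float(np.mean((out - y) ** 2))

lr = 2.0
_, out = forward(X)
initial_loss = mse(out)

for _ in range(5000):
    h, out = forward(X)
    # backpropagate the squared-error gradient layer by layer;
    # out*(1-out) and h*(1-h) are the sigmoid derivatives
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);  b1 -= lr * d_h.mean(axis=0)

_, out = forward(X)
final_loss = mse(out)
print(initial_loss, final_loss)
```

The slow, error-derivative-driven convergence visible in such a loop is exactly the inefficiency the paper's linearization and error-equalization techniques aim to reduce.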