Oscillation-Resisting in the Learning of Backpropagation Neural Networks

Abstract The paper presents a simple and efficient learning algorithm for backpropagation networks. As is well known, the standard backpropagation learning algorithm, which minimises the sum of squared errors defined over a set of training data, can suffer from slow convergence and entrapment in local minima, and can produce an oscillating learning process. To avoid this, a new performance index is proposed for learning that not only speeds up the learning dynamics but also avoids minima at higher energy levels, converging instead to a minimum at a lower energy level. The advantageous learning properties of the new performance index are strongly supported by the simulation results included in this paper.
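For intuition, the oscillation problem with a sum-of-squared-errors index can be sketched with a minimal one-weight example (an illustration of gradient descent on an SSE loss, not the paper's algorithm; all names and values here are hypothetical):

```python
import numpy as np

# Gradient descent on a sum-of-squared-errors loss
# E(w) = 0.5 * sum((y - w*x)^2) for a single weight w.
# A too-large step size makes each update overshoot the minimum,
# so the weight (and eventually the error) oscillates; a smaller
# step converges smoothly.

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])  # exact fit at w = 2.0

def sse(w):
    return 0.5 * np.sum((y - w * x) ** 2)

def grad(w):
    # dE/dw = -sum((y - w*x) * x)
    return -np.sum((y - w * x) * x)

def descend(lr, steps=20, w0=0.0):
    w = w0
    history = [sse(w)]
    for _ in range(steps):
        w -= lr * grad(w)
        history.append(sse(w))
    return w, history

# Conservative step: the error decreases monotonically.
w_ok, h_ok = descend(lr=0.05)

# Too-large step: updates overshoot and the error oscillates upward.
w_osc, h_osc = descend(lr=0.15)
```

Here the stability limit for the step size is 2 / sum(x**2) ≈ 0.143; beyond it the weight flips sign around the minimum on every update and the error grows, which is the oscillating behaviour the proposed performance index is designed to resist.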