A Backpropagation Algorithm as a Successive Projection Method
Error back propagation is one of the most popular ideas used in learning algorithms for multilayer neural networks. Since the pioneering work by Rumelhart, Hinton and Williams, error back propagation has been regarded as a gradient descent method, or a close approximation to one, for minimizing the sum-squared error function. In this paper, we point out that "on-line" back propagation is better interpreted as a successive projection method for solving a system of nonlinear inequalities. In particular, we propose a new learning algorithm based on the successive projection method, in which the stepsize is determined quantitatively from the magnitude of the error observed for each input pattern. Simulation results on the XOR and parity check problems indicate that the proposed algorithm is more effective and robust than the standard on-line back propagation algorithm.
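To illustrate the idea described in the abstract, the sketch below trains a small one-hidden-layer network on XOR with per-pattern updates, where each pattern is treated as a nonlinear inequality |y - t| <= tol and a step is taken only when that inequality is violated. The stepsize rule used here (error magnitude normalized by the squared gradient norm, in the spirit of Kaczmarz-style projections, with a relaxation factor) is an assumption for illustration; the paper's exact quantitative rule, network size, and tolerance are not specified in this abstract and may differ.

```python
# Minimal sketch: "on-line" successive-projection-style training on XOR.
# All specifics (network size, tolerance, stepsize rule) are illustrative
# assumptions, not the paper's actual algorithm.
import numpy as np

rng = np.random.default_rng(0)

# XOR patterns and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([0.0, 1.0, 1.0, 0.0])

# One hidden layer with 2 units; weight matrices include a bias column.
W1 = rng.normal(scale=0.5, size=(2, 3))   # hidden weights: 2 units x [x1, x2, bias]
W2 = rng.normal(scale=0.5, size=(1, 3))   # output weights: 1 unit x [h1, h2, bias]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    xb = np.append(x, 1.0)                # input augmented with bias
    h = sigmoid(W1 @ xb)
    hb = np.append(h, 1.0)                # hidden activations augmented with bias
    y = sigmoid(W2 @ hb)[0]
    return xb, h, hb, y

tol = 0.1                                 # each pattern defines the inequality |y - t| <= tol
for epoch in range(5000):
    violations = 0
    for x, t in zip(X, T):
        xb, h, hb, y = forward(x)
        err = y - t
        if abs(err) <= tol:               # inequality already satisfied: no step for this pattern
            continue
        violations += 1
        # Back-propagated gradient of the squared error for this single pattern
        delta_out = err * y * (1.0 - y)
        g2 = delta_out * hb                              # gradient w.r.t. W2 (row)
        delta_hid = delta_out * W2[0, :2] * h * (1.0 - h)
        g1 = np.outer(delta_hid, xb)                     # gradient w.r.t. W1
        # Error-magnitude-based stepsize: an (under-relaxed) projection toward
        # the linearized constraint, scaled by err^2 over the squared gradient norm.
        gnorm2 = np.sum(g1 ** 2) + np.sum(g2 ** 2) + 1e-12
        lam = 0.5 * err ** 2 / gnorm2     # 0.5 is an illustrative relaxation factor
        W1 -= lam * g1
        W2[0] -= lam * g2
    if violations == 0:                   # all inequalities satisfied: stop
        break

print("epochs used:", epoch + 1)
for x, t in zip(X, T):
    print(x, "->", round(forward(x)[3], 3), "target", t)
```

The key difference from standard on-line back propagation in this sketch is that no fixed learning rate is used: a pattern that already satisfies its error tolerance contributes no update, and the size of each corrective step is derived from the observed error for that pattern alone.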