Efficacy of different learning algorithms of the back-propagation network
The adaptive training algorithm, the delta-bar-delta method, and the conjugate gradient method are suggested as ways to improve the training speed of the back-propagation network. These methods are compared, and their practical effectiveness in terms of training speed, storage requirements, and suitability for hardware implementation is analyzed and discussed. When applied to two examples, the adaptive training method is shown to be the fastest and the conjugate gradient method the slowest. The robustness of both the adaptive training method and the delta-bar-delta method to their parameter settings is also considered.
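Of the three methods, delta-bar-delta has a particularly compact update rule: each weight keeps its own learning rate, which grows additively while the current gradient agrees in sign with an exponential trace of past gradients, and shrinks multiplicatively when the sign flips. The sketch below illustrates that rule on a toy quadratic with very different curvatures per coordinate; it is a minimal illustration under assumed hyperparameter values (`kappa`, `phi`, `theta`, `lr0`), not the implementation evaluated in the paper.

```python
import numpy as np

def delta_bar_delta(grad_fn, w, kappa=0.01, phi=0.2, theta=0.7,
                    lr0=0.01, steps=200):
    """Minimize via the delta-bar-delta rule (hyperparameters assumed):
    per-weight learning rates grow by `kappa` when the gradient agrees
    in sign with the exponential gradient trace, and shrink by a factor
    (1 - phi) when it disagrees."""
    lr = np.full_like(w, lr0)   # one learning rate per weight
    bar = np.zeros_like(w)      # exponential trace of past gradients
    for _ in range(steps):
        g = grad_fn(w)
        agree = g * bar
        lr = np.where(agree > 0, lr + kappa,              # consistent sign: grow additively
             np.where(agree < 0, lr * (1.0 - phi), lr))   # sign flip: shrink multiplicatively
        w = w - lr * g
        bar = (1.0 - theta) * g + theta * bar
    return w

# Toy quadratic loss 0.5 * w^T diag(2, 20) w, so grad(w) = diag(2, 20) w.
grad = lambda w: np.array([2.0, 20.0]) * w
w_final = delta_bar_delta(grad, np.array([1.0, 1.0]))
```

Because each coordinate adapts its own rate, the poorly conditioned direction is handled without forcing a single conservative global learning rate, which is the intuition behind the speed-ups reported for the adaptive methods.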