A MATLAB-Based Study on Approximation Performances of Improved Algorithms of Typical BP Neural Networks
BP neural networks are widely used, and their training algorithms are numerous. Based on artificial neural network theory, this paper studies the advantages and disadvantages of the improved algorithms of five typical BP networks. First, the learning processes of the five improved algorithms are elaborated mathematically. A specific network is then designed on the MATLAB 7.0 platform to conduct an approximation test on a given nonlinear function. Finally, the training speeds and memory consumption of the five BP networks are compared. The simulation results indicate that, for small- and medium-scale networks, the LM optimization algorithm has the best approximation ability, followed by the quasi-Newton algorithm, the conjugate gradient method, the resilient BP algorithm, and the adaptive-learning-rate algorithm.

Keywords: BP neural network; Improved algorithm; Function approximation; MATLAB
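The comparison described above can be reproduced in outline with the MATLAB Neural Network Toolbox, whose built-in training functions correspond to the five improved algorithms (`trainlm` for LM, `trainbfg` for quasi-Newton, `traincgf` for conjugate gradient, `trainrp` for resilient BP, `traingda` for adaptive learning rate). The following is a minimal sketch, not the authors' actual script; the network size, target function, and training parameters are illustrative assumptions.

```matlab
% Illustrative sketch: approximate a nonlinear function with a small
% feed-forward network, training it with each of the five algorithms.
P = -1:0.05:1;                          % input samples
T = sin(2*pi*P);                        % assumed nonlinear target function
algs = {'trainlm','trainbfg','traincgf','trainrp','traingda'};
for k = 1:numel(algs)
    % Two-layer network: 10 hidden tansig neurons, 1 linear output neuron.
    net = newff(minmax(P), [10 1], {'tansig','purelin'}, algs{k});
    net.trainParam.epochs = 200;        % assumed training budget
    net.trainParam.goal   = 1e-4;       % assumed MSE goal
    [net, tr] = train(net, P, T);
    fprintf('%-9s  epochs=%3d  final mse=%.2e\n', ...
            algs{k}, tr.epoch(end), tr.perf(end));
end
```

Running a loop of this form and recording the reached epoch count, final error, and memory usage for each algorithm yields the kind of comparison the paper reports.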