On complexity analysis of supervised MLP-learning for algorithmic comparisons
[1] Stuart E. Dreyfus et al., "On derivation of MLP backpropagation from the Kelley-Bryson optimal-control gradient formula and its application," Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000), 2000.
[2] Peter Tang et al., "The Computation of Transcendental Functions on the IA-64 Architecture," 1999.
[3] Adrian J. Shepherd, Second-Order Methods for Neural Networks: Fast and Reliable Training Methods for Multi-Layer Perceptrons, Perspectives in Neural Computing, 1997.
[4] Toshinobu Yoshida, "Rapid learning method for multilayered neural networks using two-dimensional conjugate gradient search," 1992.
[5] Martin Fodslette Møller, "A scaled conjugate gradient algorithm for fast supervised learning," Neural Networks, 1993.
[6] James Demmel et al., "On Iterative Krylov-Dogleg Trust-Region Steps for Solving Neural Networks Nonlinear Least Squares Problems," NIPS, 2000.
[7] Martin Bouchard, "New recursive-least-squares algorithms for nonlinear active control of sound and vibration using neural networks," IEEE Transactions on Neural Networks, 2001.
[8] Geoffrey E. Hinton et al., "Learning internal representations by error propagation," 1986.
[9] George D. Magoulas et al., "Improving the Convergence of the Backpropagation Algorithm Using Learning Rate Adaptation Methods," Neural Computation, 1999.