On Iterative Krylov-Dogleg Trust-Region Steps for Solving Neural Networks Nonlinear Least Squares Problems
[1] P. Toint,et al. On Large Scale Nonlinear Least Squares Calculations , 1987 .
[2] Martin Fodslette Møller,et al. A scaled conjugate gradient algorithm for fast supervised learning , 1993, Neural Networks.
[3] Howard B. Demuth,et al. Neural Network Toolbox for Use with MATLAB , 1995 .
[4] Adrian J. Shepherd,et al. Second-Order Methods for Neural Networks , 1997 .
[5] Adrian J. Shepherd,et al. Second-order methods for neural networks - fast and reliable training methods for multi-layer perceptrons , 1997, Perspectives in neural computing.
[6] Barak A. Pearlmutter. Fast Exact Multiplication by the Hessian , 1994, Neural Computation.
[7] Timothy Masters,et al. Advanced algorithms for neural networks: a C++ sourcebook , 1995 .
[8] N. Katoh,et al. Color device characterization of electronic cameras by solving adaptive networks nonlinear least squares problems , 1999, FUZZ-IEEE'99. 1999 IEEE International Fuzzy Systems Conference Proceedings (Cat. No.99CH36315).
[9] T. Steihaug. The Conjugate Gradient Method and Trust Regions in Large Scale Optimization , 1983 .
[10] Eiji Mizutani,et al. Powell's dogleg trust-region steps with the quasi-Newton augmented Hessian for neural nonlinear least-squares learning , 1999, IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339).
[11] James Demmel,et al. Applied Numerical Linear Algebra , 1997 .
[12] M. Powell. A New Algorithm for Unconstrained Optimization , 1970 .
[13] Trond Steihaug,et al. Truncated-Newton algorithms for large-scale unconstrained optimization , 1983, Math. Program..