The Incremental Gauss-Newton Algorithm with Adaptive Stepsize Rule
Hiroyuki Moriyama | Nobuo Yamashita | Masao Fukushima
[1] Dimitri P. Bertsekas, et al. A New Class of Incremental Gradient Methods for Least Squares Problems, 1997, SIAM J. Optim.
[2] W. Pitts, et al. A Logical Calculus of the Ideas Immanent in Nervous Activity (1943), 2021, Ideas That Created the Future.
[3] Dimitri P. Bertsekas, et al. Incremental Least Squares Methods and the Extended Kalman Filter, 1996, SIAM J. Optim.
[4] O. Nelles, et al. An Introduction to Optimization, 1996, IEEE Antennas and Propagation Magazine.
[5] Geoffrey E. Hinton, et al. Learning internal representations by error propagation, 1986.
[6] Paul Tseng, et al. An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule, 1998, SIAM J. Optim.
[7] O. Mangasarian, et al. Serial and parallel backpropagation convergence via nonmonotone perturbed minimization, 1994.
[8] John E. Dennis, et al. Numerical Methods for Unconstrained Optimization and Nonlinear Equations, 1983, Prentice Hall Series in Computational Mathematics.
[9] Dimitri P. Bertsekas, et al. Nonlinear Programming, 1997.
[10] Zhi-Quan Luo, et al. On the Convergence of the LMS Algorithm with Adaptive Learning Rate for Linear Feedforward Networks, 1991, Neural Computation.