A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei