Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
[1] R. Carter. On the global convergence of trust region algorithms using inexact gradient information, 1991.
[2] S. J. Wright et al. Numerical Optimization (Springer Series in Operations Research and Financial Engineering), 2000.
[3] Y. Nesterov. Introductory Lectures on Convex Optimization: A Basic Course, Applied Optimization, 2004.
[4] A. T. Kalai et al. Online convex optimization in the bandit setting: gradient descent without a gradient, SODA '05.
[5] S. M. Wild et al. Benchmarking Derivative-Free Optimization Algorithms, SIAM J. Optim., 2009.
[6] C. T. Kelley. Implicit Filtering, 2011.
[7] J. Nocedal et al. Sample size selection in optimization methods for machine learning, Math. Program., 2012.
[8] Y. Nesterov et al. Random Gradient-Free Minimization of Convex Functions, Foundations of Computational Mathematics, 2015.
[9] X. Chen et al. Evolution Strategies as a Scalable Alternative to Reinforcement Learning, arXiv, 2017.
[10] C. Paquette and K. Scheinberg. A Stochastic Line Search Method with Convergence Rate Analysis, arXiv:1807.07994, 2018.
[11] S. M. Kakade et al. Global Convergence of Policy Gradient Methods for the Linear Quadratic Regulator, ICML, 2018.
[12] I. S. Dolinskaya et al. A Derivative-Free Trust-Region Algorithm for the Optimization of Functions Smoothed via Gaussian Convolution Using Adaptive Multiple Importance Sampling, SIAM J. Optim., 2018.
[13] P. W. Glynn et al. On Sampling Rates in Simulation-Based Recursions, SIAM J. Optim., 2018.
[14] K. Scheinberg et al. Global convergence rate analysis of unconstrained optimization methods based on probabilistic models, Mathematical Programming, 2015.
[15] S. M. Wild et al. Derivative-free optimization methods, Acta Numerica, 2019.
[16] J. Nocedal et al. Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods, SIAM J. Optim., 2018.
[17] K. Scheinberg et al. A Theoretical and Empirical Comparison of Gradient Approximations in Derivative-Free Optimization, Foundations of Computational Mathematics, 2019.