Gradient-free two-point optimal method for non-smooth stochastic convex optimization problems with additional small noise
Using the double-smoothing technique and stochastic mirror descent with an inexact oracle, we build an algorithm that is optimal (up to a multiplicative factor) for two-point gradient-free non-smooth stochastic convex programming. We also investigate how large the noise level can be (the noise is not necessarily stochastic in nature) while the rate of convergence is preserved (up to a multiplicative factor).
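For intuition, the sketch below shows a generic two-point gradient-free estimator plugged into a Euclidean mirror-descent (projected subgradient) step over a ball. This is only a minimal illustration of the type of method the abstract refers to: the step size alpha, smoothing parameter tau, ball radius, and Euclidean prox are illustrative assumptions and not the paper's exact double-smoothing scheme or step-size policy.

```python
# Hedged sketch (not the authors' exact method): two-point gradient-free
# estimator + Euclidean mirror descent (projected subgradient) on a ball.
import numpy as np

def two_point_grad_estimate(f, x, tau, rng):
    """Estimate a gradient of the randomized smoothing of f at x
    from two noisy function values taken along the same random direction."""
    d = x.size
    e = rng.normal(size=d)
    e /= np.linalg.norm(e)                       # direction uniform on the unit sphere
    # Same direction e in both calls: this is the "two-point" feedback.
    return d * (f(x + tau * e) - f(x - tau * e)) / (2.0 * tau) * e

def mirror_descent_ball(f, x0, radius, alpha, tau, n_iters, seed=0):
    """Euclidean mirror descent driven by the two-point estimator above;
    alpha, tau, radius are assumed hyperparameters, not tuned constants."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    avg = np.zeros_like(x)
    for _ in range(n_iters):
        g = two_point_grad_estimate(f, x, tau, rng)
        x = x - alpha * g
        nrm = np.linalg.norm(x)
        if nrm > radius:                         # project back onto the Euclidean ball
            x *= radius / nrm
        avg += x
    return avg / n_iters                         # averaged iterate, standard for convex MD

if __name__ == "__main__":
    # Usage: a noisy non-smooth convex objective (l1-norm plus small additive noise).
    noise_rng = np.random.default_rng(1)
    f = lambda x: np.abs(x).sum() + 1e-3 * noise_rng.normal()
    x_hat = mirror_descent_ball(f, x0=np.ones(10), radius=5.0,
                                alpha=0.01, tau=1e-4, n_iters=5000)
    print("norm of averaged iterate:", np.linalg.norm(x_hat))
```

The small additive term in `f` stands in for the non-stochastic noise discussed in the abstract; how large it may be without degrading the convergence rate is precisely the question the paper studies.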