A gradient sampling method with complexity guarantees for general Lipschitz functions
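The title refers to the gradient sampling family of methods for nonsmooth, nonconvex optimization. The classical idea (due to Goldstein, and to Burke, Lewis, and Overton) is: at the current point, sample gradients at nearby points where the function is differentiable, take the minimum-norm element of their convex hull as a surrogate for an ε-subgradient, and move against it with a line search. The sketch below is a minimal, illustrative implementation of that generic idea only; it is not the specific algorithm or complexity analysis of this paper, and the test objective, sample count, and solver for the min-norm subproblem (projected gradient over the simplex) are my own choices.

```python
import numpy as np

def f(x):
    # Nonsmooth, convex test objective: f(x) = |x1| + 2|x2|, minimized at 0.
    return abs(x[0]) + 2.0 * abs(x[1])

def grad_f(x):
    # Gradient wherever f is differentiable (almost everywhere);
    # np.sign returns 0 at 0, which is still a valid subgradient element.
    return np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])

def project_simplex(v):
    # Euclidean projection onto the probability simplex {w >= 0, sum w = 1}.
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / np.arange(1, len(v) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1), 0.0)

def min_norm_in_hull(G, iters=500):
    # Approximate the minimum-norm element of conv{rows of G} by projected
    # gradient descent on w -> ||G^T w||^2 over the simplex, step 1/L.
    m = G.shape[0]
    L = 2.0 * np.linalg.norm(G, 2) ** 2 + 1e-12  # 2 * sigma_max(G)^2
    w = np.full(m, 1.0 / m)
    for _ in range(iters):
        w = project_simplex(w - (2.0 / L) * (G @ (G.T @ w)))
    return G.T @ w

def gradient_sampling_step(x, eps=0.1, m=20, rng=None):
    # One step: sample m gradients in an eps-ball around x, form the
    # min-norm hull element d, and backtrack along -d (Armijo-style).
    rng = np.random.default_rng(0) if rng is None else rng
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
    G = np.vstack([grad_f(p) for p in pts] + [grad_f(x)])
    d = min_norm_in_hull(G)
    if np.linalg.norm(d) < 1e-8:
        return x  # approximately eps-stationary: 0 is (nearly) in the hull
    t = 1.0
    while f(x - t * d) > f(x) - 0.5 * t * np.dot(d, d) and t > 1e-12:
        t *= 0.5
    return x - t * d
```

Starting from e.g. `x = np.array([1.0, -1.5])`, repeated calls to `gradient_sampling_step` drive `f(x)` toward its minimum at the origin, where sampled gradients of opposite signs place 0 near the convex hull and the step size collapses. Real implementations differ mainly in how the min-norm quadratic subproblem is solved and in how the sampling radius is shrunk over iterations.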