Black-box optimization of noisy functions with unknown smoothness
[1] Csaba Szepesvári, et al. X-Armed Bandits, 2011, J. Mach. Learn. Res.
[2] Adam D. Bull, et al. Adaptive-treed bandits, 2013, arXiv:1302.2489.
[3] Eli Upfal, et al. Multi-Armed Bandits in Metric Spaces, 2008.
[4] Rémi Munos, et al. Optimistic Optimization of Deterministic Functions, 2011, NIPS.
[5] Philippe Preux, et al. Bandits attack function optimization, 2014, IEEE Congress on Evolutionary Computation (CEC).
[6] Alexandre Proutière, et al. Unimodal Bandits without Smoothness, 2014, arXiv.
[7] Rémi Munos, et al. Pure exploration in finitely-armed and continuous-armed bandits, 2011, Theor. Comput. Sci.
[8] Rémi Munos, et al. From Bandits to Monte-Carlo Tree Search: The Optimistic Principle Applied to Optimization and Planning, 2014, Found. Trends Mach. Learn.
[9] Rémi Munos, et al. Stochastic Simultaneous Optimistic Optimization, 2013, ICML.
[10] Alessandro Lazaric, et al. Online Stochastic Optimization under Correlated Bandit Feedback, 2014, ICML.
[11] Jia Yuan Yu, et al. Lipschitz Bandits without the Lipschitz Constant, 2011, ALT.
[12] Peter Auer, et al. Finite-time Analysis of the Multiarmed Bandit Problem, 2002, Machine Learning.
[13] Rémi Munos, et al. Bandit Algorithms for Tree Search, 2007, UAI.
[14] Aleksandrs Slivkins, et al. Multi-armed bandits on implicit metric spaces, 2011, NIPS.
[15] Csaba Szepesvári, et al. Bandit Based Monte-Carlo Planning, 2006, ECML.
[16] Alexandre Proutière, et al. Unimodal Bandits: Regret Lower Bounds and Optimal Algorithms, 2014, ICML.