Tianbao Yang | Lijun Zhang | Zhe Li