Tomaso A. Poggio | Brando Miranda | Qianli Liao | Andrzej Banburski | Jack Hidary
[1] Peter L. Bartlett et al. Neural Network Learning: Theoretical Foundations, 1999.
[2] Gábor Lugosi et al. Introduction to Statistical Learning Theory, 2004, Advanced Lectures on Machine Learning.
[3] Michael I. Jordan et al. Convexity, Classification, and Risk Bounds, 2006.
[4] Ambuj Tewari et al. On the Complexity of Linear Prediction: Risk Bounds, Margin Bounds, and Regularization, 2008, NIPS.
[5] Klaus-Robert Müller et al. Efficient BackProp, 2012, Neural Networks: Tricks of the Trade.
[6] Jian Sun et al. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification, 2015, IEEE International Conference on Computer Vision (ICCV).
[7] Samy Bengio et al. Understanding deep learning requires rethinking generalization, 2016, ICLR.
[8] Tomaso A. Poggio et al. Theory II: Landscape of the Empirical Risk in Deep Learning, 2017, ArXiv.
[9] Matus Telgarsky et al. Spectrally-normalized margin bounds for neural networks, 2017, NIPS.
[10] Luca Antiga et al. Automatic differentiation in PyTorch, 2017.
[11] Lorenzo Rosasco et al. Theory of Deep Learning III: explaining the non-overfitting puzzle, 2017, ArXiv.
[12] Tomaso Poggio et al. Classical generalization bounds are surprisingly tight for Deep Networks, 2018.
[13] Ohad Shamir et al. Size-Independent Sample Complexity of Neural Networks, 2017, COLT.
[14] Tomaso A. Poggio et al. Theory of Deep Learning IIb: Optimization Properties of SGD, 2018, ArXiv.
[15] Tomaso A. Poggio et al. Theory IIIb: Generalization in Deep Networks, 2018, ArXiv.
[16] Tomaso A. Poggio et al. Fisher-Rao Metric, Geometry, and Complexity of Neural Networks, 2017, AISTATS.