Gintare Karolina Dziugaite, Alexandre Drouin, Brady Neal, Nitarshan Rajkumar, Ethan Caballero, Linbo Wang, Ioannis Mitliagkas, Daniel M. Roy