[1] Shaofeng Zou et al. Tightening Mutual Information Based Bounds on Generalization Error, 2019, 2019 IEEE International Symposium on Information Theory (ISIT).
[2] Maxim Raginsky et al. Information-theoretic analysis of generalization capability of learning algorithms, 2017, NIPS.
[3] F. Alajaji et al. Lecture Notes in Information Theory, 2000.
[4] Mikael Skoglund et al. Upper Bounds on the Generalization Error of Private Algorithms for Discrete Data, 2020, IEEE Transactions on Information Theory.
[5] Gintare Karolina Dziugaite et al. Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates, 2019, NeurIPS.
[6] Giuseppe Durisi et al. Generalization Bounds via Information Density and Conditional Information Density, 2020, IEEE Journal on Selected Areas in Information Theory.
[7] Thomas Steinke et al. Reasoning About Generalization via Conditional Mutual Information, 2020, COLT.
[8] Gintare Karolina Dziugaite et al. Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms, 2020, NeurIPS.
[9] Shai Ben-David et al. Understanding Machine Learning: From Theory to Algorithms, 2014.
[10] James Zou et al. How Much Does Your Data Exploration Overfit? Controlling Bias via Information Usage, 2015, IEEE Transactions on Information Theory.