Individually Conditional Individual Mutual Information Bound on Generalization Error
[1] Michael Gastpar et al. Strengthened Information-theoretic Bounds on the Generalization Error. 2019 IEEE International Symposium on Information Theory (ISIT), 2019.
[2] Gintare Karolina Dziugaite et al. Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates. NeurIPS, 2019.
[3] Jonathan M. Nichols et al. Calculation of Differential Entropy for a Mixed Gaussian Distribution. Entropy, 2008.
[4] Jonathan H. Manton et al. Information-theoretic analysis for transfer learning. 2020 IEEE International Symposium on Information Theory (ISIT), 2020.
[5] Daniel M. Roy et al. Sharpened Generalization Bounds based on Conditional Mutual Information and an Application to Noisy, Iterative Algorithms. NeurIPS, 2020.
[6] James Zou et al. Controlling Bias in Adaptive Data Analysis Using Information Theory. AISTATS, 2015.
[7] Gábor Lugosi et al. Concentration Inequalities: A Nonasymptotic Theory of Independence. 2013.
[8] Maxim Raginsky et al. Information-theoretic analysis of generalization capability of learning algorithms. NIPS, 2017.
[9] Shai Ben-David et al. Understanding Machine Learning: From Theory to Algorithms. 2014.
[10] Osvaldo Simeone et al. Information-Theoretic Generalization Bounds for Meta-Learning and Applications. Entropy, 2020.
[11] Shaofeng Zou et al. Tightening Mutual Information Based Bounds on Generalization Error. 2019 IEEE International Symposium on Information Theory (ISIT), 2019.
[12] Thomas Steinke et al. Reasoning About Generalization via Conditional Mutual Information. COLT, 2020.
[13] Sergio Verdú et al. Chaining Mutual Information and Tightening Generalization Bounds. NeurIPS, 2018.
[14] Mikael Skoglund et al. On Random Subset Generalization Error Bounds and the Stochastic Gradient Langevin Dynamics Algorithm. 2020 IEEE Information Theory Workshop (ITW), 2020.