Information-Theoretic Bounds on Transfer Generalization Gap Based on Jensen-Shannon Divergence
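For context, the bounds named in the title are stated in terms of the Jensen-Shannon divergence. Its standard definition (cf. [2], [6]) is, as a reference sketch and not a reproduction of the paper's specific bound:

\[
\mathrm{JS}(P \,\|\, Q) \;=\; \tfrac{1}{2}\,\mathrm{KL}(P \,\|\, M) \;+\; \tfrac{1}{2}\,\mathrm{KL}(Q \,\|\, M),
\qquad M \;=\; \tfrac{1}{2}(P + Q),
\]

where \(\mathrm{KL}\) denotes the Kullback-Leibler divergence and \(M\) is the equal-weight mixture of \(P\) and \(Q\).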
[1] Emilio Soria Olivas, et al. Handbook of Research on Machine Learning Applications and Trends: Algorithms, Methods, and Techniques, 2009.
[2] Frank Nielsen, et al. On a Generalization of the Jensen–Shannon Divergence and the Jensen–Shannon Centroid, 2019, Entropy.
[3] Koby Crammer, et al. A theory of learning from different domains, 2010, Machine Learning.
[4] Shaofeng Zou, et al. Tightening Mutual Information Based Bounds on Generalization Error, 2019, IEEE International Symposium on Information Theory (ISIT).
[5] Giuseppe Durisi, et al. Generalization Bounds via Information Density and Conditional Information Density, 2020, IEEE Journal on Selected Areas in Information Theory.
[6] Frank Nielsen, et al. A family of statistical symmetric divergences based on Jensen's inequality, 2010, ArXiv.
[7] Yishay Mansour, et al. Domain Adaptation: Learning Bounds and Algorithms, 2009, COLT.
[8] Lei Zhang, et al. Generalization Bounds for Domain Adaptation, 2012, NIPS.
[9] Takuya Yamano, et al. Some bounds for skewed α-Jensen-Shannon divergence, 2019, Results in Applied Mathematics.
[10] Maxim Raginsky, et al. Information-theoretic analysis of generalization capability of learning algorithms, 2017, NIPS.
[11] Qi Chen, et al. Beyond H-Divergence: Domain Adaptation Theory With Jensen-Shannon Divergence, 2020, ArXiv.
[12] F. Alajaji, et al. Lecture Notes in Information Theory, 2000.
[13] Jonathan H. Manton, et al. Information-theoretic analysis for transfer learning, 2020, IEEE International Symposium on Information Theory (ISIT).
[14] Koby Crammer, et al. Analysis of Representations for Domain Adaptation, 2006, NIPS.
[15] Gavriel Salomon, et al. Transfer of Learning, 1992.