Information-theoretic analysis for transfer learning
Jonathan H. Manton | Uwe Aickelin | Jingge Zhu | Xuetong Wu