[1] Mark D. Reid et al. Information, Divergence and Risk for Binary Experiments. Journal of Machine Learning Research, 2009.
[2] Igal Sason. On the Rényi divergence and the joint range of relative entropies. In 2015 IEEE International Symposium on Information Theory (ISIT), 2015.
[3] Vincent Yan Fu Tan et al. A Tight Upper Bound for the Third-Order Asymptotics for Most Discrete Memoryless Channels. IEEE Transactions on Information Theory, 2012.
[4] Sever S. Dragomir. Bounds for the normalised Jensen functional. Bulletin of the Australian Mathematical Society, 2006.
[5] Sergio Verdú et al. Total variation distance and the distribution of relative information. In 2014 Information Theory and Applications Workshop (ITA), 2014.
[6] Daniel Berend et al. Minimum KL-Divergence on Complements of $L_{1}$ Balls. IEEE Transactions on Information Theory, 2012.
[7] Gustavo L. Gilardoni. On the minimum f-divergence for given total variation. 2006.
[8] Gustavo L. Gilardoni. On Pinsker's and Vajda's Type Inequalities for Csiszár's $f$-Divergences. IEEE Transactions on Information Theory, 2006.
[9] Sergio Verdú et al. Simulation of random processes and rate-distortion theory. IEEE Transactions on Information Theory, 1996.
[10] Zhengmin Zhang et al. Estimating Mutual Information Via Kolmogorov Distance. IEEE Transactions on Information Theory, 2007.
[11] Peter Harremoës et al. Rényi Divergence and Kullback-Leibler Divergence. IEEE Transactions on Information Theory, 2012.
[12] Edward C. van der Meulen et al. Mutual information, variation, and Fano's inequality. Problems of Information Transmission, 2008.
[13] Thomas M. Cover et al. Elements of Information Theory, Second Edition. Wiley, 2005.
[14] Peter Harremoës et al. Refinements of Pinsker's inequality. IEEE Transactions on Information Theory, 2003.
[15] Erik Ordentlich et al. A distribution dependent refinement of Pinsker's inequality. IEEE Transactions on Information Theory, 2005.
[16] Raymond W. Yeung et al. The Interplay Between Entropy and Variational Distance. IEEE Transactions on Information Theory, 2010.
[17] Sergio Verdú et al. Channels With Cost Constraints: Strong Converse and Dispersion. IEEE Transactions on Information Theory, 2014.
[18] Bernhard C. Geiger et al. Optimal Quantization for Distribution Synthesis. IEEE Transactions on Information Theory, 2013.
[19] Adityanand Guntuboyina et al. Sharp Inequalities for $f$-Divergences. IEEE Transactions on Information Theory, 2014.
[20] Igal Sason. Tight bounds for symmetric divergence measures and a new inequality relating f-divergences. In 2015 IEEE Information Theory Workshop (ITW), 2015.
[21] Igal Sason et al. Entropy Bounds for Discrete Random Variables via Maximal Coupling. IEEE Transactions on Information Theory, 2012.
[22] T. Kailath. The Divergence and Bhattacharyya Distance Measures in Signal Selection. IEEE Transactions on Communication Technology, 1967.
[23] Igor Vajda. Note on discrimination information and variation (Corresp.). IEEE Transactions on Information Theory, 1970.
[24] Vyacheslav V. Prelov et al. On inequalities between mutual information and variation. Problems of Information Transmission, 2007.
[25] Imre Csiszár et al. Context tree estimation for not necessarily finite memory processes, via BIC and MDL. IEEE Transactions on Information Theory, 2005.