BOUNDS FOR ENTROPY AND DIVERGENCE FOR DISTRIBUTIONS OVER A TWO-ELEMENT SET
[1] M. S. Pinsker, Information and Information Stability of Random Variables and Processes (translated by A. Feinstein), 1964.
[2] N. S. Kambo et al., "On exponential bounds for binomial probabilities," 1966.
[3] O. Krafft, "A note on exponential bounds for binomial probabilities," 1969.
[4] I. Vajda et al., "Note on discrimination information and variation (Corresp.)," IEEE Trans. Inf. Theory, 1970.
[5] P. Henrici, Applied and Computational Complex Analysis, 1988.
[6] G. T. Toussaint et al., "Sharper lower bounds for discrimination information in terms of variation (Corresp.)," IEEE Trans. Inf. Theory, 1975.
[7] J. Lin et al., "Divergence measures based on the Shannon entropy," IEEE Trans. Inf. Theory, 1991.
[8] H. Zhu, "On Information and Sufficiency," 1997.
[9] F. Topsøe et al., "Some inequalities for information divergence and related measures of discrimination," IEEE Trans. Inf. Theory, 2000.
[10] P. Harremoës et al., "Inequalities between entropy and index of coincidence derived from information diagrams," IEEE Trans. Inf. Theory, 2001.
[11] F. Topsøe et al., "Vajda's tight lower bound and refinements of Pinsker's inequality," Proc. 2001 IEEE Int. Symp. on Information Theory (ISIT), 2001.
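As an orientation to the quantities named in the title and in references [4], [6], and [11], the following is a minimal numerical sketch, not taken from the paper itself: it computes the entropy H(p) and the information divergence D(p‖q) of two-element distributions (p, 1−p) and (q, 1−q) in nats, and checks them against the classical Pinsker lower bound D ≥ V²/2 with V = 2|p−q|. All function names are illustrative, not from the cited works.

```python
# Illustrative sketch only: entropy and divergence for two-element
# distributions (p, 1-p) and (q, 1-q), plus the classical Pinsker bound
# D(p||q) >= V^2 / 2, where V = 2|p - q| is the variation distance.
# Natural logarithms (nats) are used throughout.

import math

def entropy(p: float) -> float:
    """Entropy H(p) of the distribution (p, 1-p), in nats."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

def divergence(p: float, q: float) -> float:
    """Information divergence D((p, 1-p) || (q, 1-q)), in nats."""
    if q in (0.0, 1.0) and p != q:
        return math.inf  # absolute continuity fails
    d = 0.0
    if p > 0:
        d += p * math.log(p / q)
    if p < 1:
        d += (1 - p) * math.log((1 - p) / (1 - q))
    return d

if __name__ == "__main__":
    for p, q in [(0.1, 0.5), (0.3, 0.4), (0.5, 0.9)]:
        V = 2 * abs(p - q)        # variation distance between the two distributions
        D = divergence(p, q)
        print(f"p={p}, q={q}: H(p)={entropy(p):.4f}, D={D:.4f}, "
              f"Pinsker lower bound V^2/2={V * V / 2:.4f}")
```

For example, with p = 0.1 and q = 0.5 this prints D ≈ 0.368 against the Pinsker bound 0.32, consistent with the inequality the refinements in [11] sharpen.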