A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
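For context, the entropy-power inequality (EPI) named in the title and treated throughout the cited works can be stated in its standard form; the notation below is a conventional formulation, not taken verbatim from the listed paper:

```latex
% Entropy-power inequality (Shannon, 1948): for independent
% random vectors X, Y in R^n with well-defined densities,
N(X + Y) \;\ge\; N(X) + N(Y),
\qquad\text{where}\qquad
N(X) \;=\; \frac{1}{2\pi e}\, e^{\tfrac{2}{n} h(X)}
```

Here \(h(X)\) is the differential entropy of \(X\), and equality holds if and only if \(X\) and \(Y\) are Gaussian with proportional covariance matrices.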
[1] Shlomo Shamai et al., "Proof of Entropy Power Inequalities via MMSE," 2006 IEEE International Symposium on Information Theory, 2006.
[2] Shlomo Shamai et al., "The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel," IEEE Transactions on Information Theory, 2006.
[3] Nelson M. Blachman, "The Convolution Inequality for Entropy Powers," IEEE Transactions on Information Theory, 1965.
[4] Shlomo Shamai et al., "Mutual Information and Minimum Mean-Square Error in Gaussian Channels," IEEE Transactions on Information Theory, 2004.
[5] Amir Dembo et al., "Information Theoretic Inequalities," IEEE Transactions on Information Theory, 1991.
[6] Thomas M. Cover et al., Elements of Information Theory, 2005.
[7] Jean-François Bercher et al., "Estimating the Entropy of a Signal with Applications," IEEE Transactions on Signal Processing, 2000.
[8] Sergio Verdú et al., "A Simple Proof of the Entropy-Power Inequality," IEEE Transactions on Information Theory, 2006.
[9] A. Barron, "Entropy and the Central Limit Theorem," 1986.
[10] C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, 1948.
[11] Jean-François Bercher et al., "Estimating the Entropy of a Signal with Applications," 1999 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 1999.
[12] Ram Zamir, "A Proof of the Fisher Information Inequality via a Data Processing Argument," IEEE Transactions on Information Theory, 1998.
[13] A. J. Stam, "Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon," Information and Control, 1959.