On Rényi entropy power inequalities
Eshed Ram | Igal Sason
[1] Tetsunao Matsuta, et al. International Conference Report: 2013 IEEE International Symposium on Information Theory, 2013.
[2] Naresh Sharma, et al. Entropy power inequality for a family of discrete random variables, 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[3] Shlomo Shamai, et al. The Interplay Between Information and Estimation Measures, 2013, Found. Trends Signal Process.
[4] C. E. Shannon. A Mathematical Theory of Communication, 1948, Bell Syst. Tech. J.
[5] Mokshay Madiman, et al. On the entropy of sums, 2008, 2008 IEEE Information Theory Workshop.
[6] Christophe Vignat, et al. An Entropy Power Inequality for the Binomial Family, 2003.
[7] Shlomo Shamai, et al. Mutual information and minimum mean-square error in Gaussian channels, 2004, IEEE Transactions on Information Theory.
[8] Oliver Johnson, et al. Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality, 2009, IEEE Transactions on Information Theory.
[9] Jae Oh Woo, et al. A discrete entropy power inequality for uniform distributions, 2015, 2015 IEEE International Symposium on Information Theory (ISIT).
[10] S. Bobkov, et al. Bounds on the Maximum of the Density for Sums of Independent Random Variables, 2014.
[11] A. J. Stam. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon, 1959, Inf. Control.
[12] Thomas A. Courtade, et al. Strengthening the entropy power inequality, 2016, 2016 IEEE International Symposium on Information Theory (ISIT).
[13] Giuseppe Toscani, et al. The Concavity of Rényi Entropy Power, 2014, IEEE Transactions on Information Theory.
[14] Stephen P. Boyd, et al. Convex Optimization, 2004, Algorithms and Theory of Computation Handbook.
[15] Patrick P. Bergmans, et al. A simple converse for broadcast channels with additive white Gaussian noise (Corresp.), 1974, IEEE Trans. Inf. Theory.
[16] Varun Jog, et al. The Entropy Power Inequality and Mrs. Gerber's Lemma for Groups of Order 2^n, 2014, IEEE Trans. Inf. Theory.
[17] Shlomo Shamai, et al. A binary analog to the entropy-power inequality, 1990, IEEE Trans. Inf. Theory.
[18] Tie Liu, et al. An Extremal Inequality Motivated by Multiterminal Information Theoretic Problems, 2006, ISIT.
[19] Jean-François Bercher, et al. A Renyi entropy convolution inequality with application, 2002, 2002 11th European Signal Processing Conference.
[20] Sergey G. Bobkov, et al. On the problem of reversibility of the entropy power inequality, 2011, arXiv.
[21] E. Lieb, et al. Best Constants in Young's Inequality, Its Converse, and Its Generalization to More than Three Functions, 1976.
[22] Amir Dembo, et al. Information theoretic inequalities, 1991, IEEE Trans. Inf. Theory.
[23] A. Rényi. On Measures of Entropy and Information, 1961.
[24] Liyao Wang, et al. Beyond the Entropy Power Inequality, via Rearrangements, 2013, IEEE Transactions on Information Theory.
[25] Mokshay Madiman, et al. Entropy Bounds on Abelian Groups and the Ruzsa Divergence, 2015, IEEE Transactions on Information Theory.
[26] O. Johnson. Information Theory and the Central Limit Theorem, 2004.
[27] Emre Telatar, et al. A New Entropy Power Inequality for Integer-Valued Random Variables, 2014, IEEE Trans. Inf. Theory.
[28] Giuseppe Toscani, et al. A Strengthened Entropy Power Inequality for Log-Concave Densities, 2014, IEEE Transactions on Information Theory.
[29] Shlomo Shamai, et al. Proof of Entropy Power Inequalities via MMSE, 2006, 2006 IEEE International Symposium on Information Theory.
[30] Ofer Shayevitz, et al. On Rényi measures and hypothesis testing, 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[31] Martin E. Hellman, et al. The Gaussian wire-tap channel, 1978, IEEE Trans. Inf. Theory.
[32] Peng Xu, et al. Forward and Reverse Entropy Power Inequalities in Convex Geometry, 2016, arXiv.
[33] Shlomo Shamai, et al. The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel, 2006, IEEE Transactions on Information Theory.
[34] Nelson M. Blachman, et al. The convolution inequality for entropy powers, 1965, IEEE Trans. Inf. Theory.
[35] Mokshay M. Madiman, et al. Generalized Entropy Power Inequalities and Monotonicity Properties of Information, 2006, IEEE Transactions on Information Theory.
[36] Sergey G. Bobkov, et al. Entropy Power Inequality for the Rényi Entropy, 2015, IEEE Transactions on Information Theory.
[37] Igal Sason, et al. On Rényi Entropy Power Inequalities, 2016, IEEE Transactions on Information Theory.
[38] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[39] Peter Harremoës, et al. Rényi Divergence and Kullback-Leibler Divergence, 2012, IEEE Transactions on Information Theory.
[40] Sergey G. Bobkov, et al. Dimensional behaviour of entropy and information, 2011, arXiv.
[41] Max H. M. Costa, et al. A new entropy power inequality, 1985, IEEE Trans. Inf. Theory.
[42] J. Bunch, et al. Rank-one modification of the symmetric eigenproblem, 1978.
[43] F. Barthe. Optimal Young's inequality and its converse: a simple proof, 1997, math/9704210.
[44] Serge Fehr, et al. On the Conditional Rényi Entropy, 2014, IEEE Transactions on Information Theory.
[45] C. Vignat, et al. Some results concerning maximum Renyi entropy distributions, 2005, math/0507400.
[46] William Beckner, et al. Inequalities in Fourier Analysis on R^n, 1975.
[47] Yasutada Oohama, et al. The Rate-Distortion Function for the Quadratic Gaussian CEO Problem, 1998, IEEE Trans. Inf. Theory.
[48] Olivier Rioul, et al. Information Theoretic Proofs of Entropy Power Inequalities, 2007, IEEE Transactions on Information Theory.
[49] Jae Oh Woo, et al. A lower bound on the Rényi entropy of convolutions in the integers, 2014, 2014 IEEE International Symposium on Information Theory.
[50] Sergio Verdú, et al. A simple proof of the entropy-power inequality, 2006, IEEE Transactions on Information Theory.