Sharp Moment-Entropy Inequalities and Capacity Bounds for Symmetric Log-Concave Distributions

We show that the uniform distribution minimizes entropy among all one-dimensional symmetric log-concave distributions with fixed variance, and we establish several generalizations of this fact to Rényi entropies of orders less than 1 and to moment constraints involving $p$-th absolute moments with $p\leq 2$. As consequences, we give new capacity bounds for additive noise channels with symmetric log-concave noise, as well as for timing channels involving positive signal and noise where the noise has a decreasing log-concave density. In particular, we show that the capacity of an additive noise channel with symmetric, log-concave noise under an average power constraint is at most 0.254 bits per channel use greater than the capacity of an additive Gaussian noise channel with the same noise power. Consequences for reverse entropy power inequalities and connections to the slicing problem in convex geometry are also discussed.
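The 0.254-bit figure is consistent with the entropy gap between a Gaussian and a uniform distribution of equal variance, namely $\frac{1}{2}\log_2(\pi e/6)\approx 0.2546$ bits; the sketch below is an illustrative numerical check of that constant, not code from the paper.

```python
import math

# Differential entropy (in bits) of a Gaussian with variance s2:
#   0.5 * log2(2 * pi * e * s2)
# Differential entropy (in bits) of a uniform with the same variance:
#   0.5 * log2(12 * s2)
# Their difference is independent of the variance s2:
gap = 0.5 * math.log2(math.pi * math.e / 6)
print(f"Gaussian-vs-uniform entropy gap: {gap:.4f} bits")  # ≈ 0.2546
```

Truncated to three decimals this matches the 0.254 bits per channel use stated in the abstract.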
