An Integral Representation of the Logarithmic Function with Applications in Information Theory

We explore a well-known integral representation of the logarithmic function and demonstrate its usefulness in obtaining compact, easily computable, exact formulas for quantities that involve expectations and higher moments of the logarithm of a positive random variable (or of the logarithm of a sum of i.i.d. positive random variables). The representation proves useful in a variety of information-theoretic applications, including universal lossless data compression, entropy and differential entropy evaluations, and the calculation of the ergodic capacity of the single-input, multiple-output (SIMO) Gaussian channel with random parameters (known to both transmitter and receiver). This integral representation and its variants are anticipated to serve as a useful tool in additional applications, as a rigorous alternative, at least in some situations, to the popular but non-rigorous replica method.
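The abstract does not display the identity itself; a minimal sketch, assuming the representation in question is the classical Frullani-type identity (a natural fit for the description above), is

    \ln x \;=\; \int_0^\infty \frac{e^{-u} - e^{-ux}}{u}\,du, \qquad x > 0,

so that, exchanging expectation and integration (justified, e.g., when \mathbb{E}|\ln X| < \infty),

    \mathbb{E}[\ln X] \;=\; \int_0^\infty \frac{e^{-u} - \mathbb{E}\!\left[e^{-uX}\right]}{u}\,du,
    \qquad
    \mathbb{E}\!\left[\ln \sum_{i=1}^{n} X_i\right] \;=\; \int_0^\infty \frac{e^{-u} - \left(\mathbb{E}\!\left[e^{-uX_1}\right]\right)^{n}}{u}\,du,

where the X_i are i.i.d. copies of X. Under this reading, the expectation of a logarithm reduces to a one-dimensional integral of the Laplace transform of X (the MGF evaluated at a negative argument), which is what makes compact, easily computable formulas possible, e.g., for sums of i.i.d. exponential random variables.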
