Exact Expressions in Source and Channel Coding Problems Using Integral Representations

We explore known integral representations of the logarithmic and power functions, and demonstrate their usefulness for information-theoretic analyses. We obtain compact, easily computable exact formulas for several source and channel coding problems that involve expectations and higher moments of the logarithm of a positive random variable, as well as moments of order ρ > 0 of a non-negative random variable (or of a sum of i.i.d. positive random variables). These integral representations are used in a variety of applications, including the calculation of the degradation in mutual information between the channel input and output as a result of jamming, universal lossless data compression, Shannon and Rényi entropy evaluations, and the evaluation of the ergodic capacity of the single-input multiple-output (SIMO) Gaussian channel with random parameters (known to both transmitter and receiver). The integral representation of the logarithmic function and its variants are anticipated to serve as a rigorous alternative to the popular (but non-rigorous) replica method (at least in some situations).
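
For context, one standard integral representation of this kind is the Frullani-type identity (stated here as an illustrative sketch of the general approach, not as a quotation of the paper's specific results):
\[
\ln x \;=\; \int_0^{\infty} \frac{e^{-u} - e^{-ux}}{u}\, du, \qquad x > 0 .
\]
For a positive random variable $X$ with $\mathbb{E}|\ln X| < \infty$, interchanging expectation and integration (e.g., via Fubini's theorem) then gives
\[
\mathbb{E}[\ln X] \;=\; \int_0^{\infty} \frac{e^{-u} - \mathbb{E}\!\left[e^{-uX}\right]}{u}\, du ,
\]
which expresses the expected logarithm through $\mathbb{E}[e^{-uX}]$, a transform that is often available in closed form, so the expectation reduces to a single one-dimensional integral.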