Information-Theoretic Applications of the Logarithmic Probability Comparison Bound

A well-known technique for estimating the probabilities of rare events, in general and in information theory in particular (used, for example, in the sphere-packing bound), is to find a reference probability measure under which the event of interest has probability of order one and to estimate the probability in question by means of the Kullback-Leibler divergence. A method recently proposed in [2] can be viewed as an extension of this idea, in which the probability under the reference measure may itself decay exponentially and the Rényi divergence is used instead. The purpose of this paper is to demonstrate the usefulness of this approach in various information-theoretic settings. For the problem of channel coding, we provide a general methodology for obtaining matched, mismatched, and robust error-exponent bounds, as well as new results for a variety of specific channel models. Other applications we address include rate-distortion coding and the problem of guessing.
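To make the comparison concrete: for probability measures P and Q, an event A, and α > 1, Hölder's inequality yields log P(A) ≥ (α/(α−1)) log Q(A) − R_α(Q‖P), where R_α denotes the Rényi divergence of order α. Unlike the classical change-of-measure argument, this remains informative when Q(A) itself decays exponentially. The following Python sketch checks the inequality numerically for Bernoulli product measures; the bound form is the Hölder-based one stated above, and the measures, parameters, and function names are illustrative assumptions of this example, not statements taken from [2].

import math

def renyi_divergence(q, p, alpha, n):
    # Renyi divergence R_alpha(Q||P) for product measures Q = Bern(q)^n,
    # P = Bern(p)^n; the divergence is additive, so it equals n times the
    # single-letter value (1/(alpha-1)) * log E_P[(dQ/dP)^alpha].
    single = q**alpha * p**(1 - alpha) + (1 - q)**alpha * (1 - p)**(1 - alpha)
    return n * math.log(single) / (alpha - 1)

def log_tail(theta, n, k):
    # Exact log-probability that a Binomial(n, theta) variable is >= k.
    tail = sum(math.comb(n, j) * theta**j * (1 - theta)**(n - j)
               for j in range(k, n + 1))
    return math.log(tail)

# Rare event A = {at least 60 ones in 100 trials} under P = Bern(0.2)^100.
# Under the reference measure Q = Bern(0.5)^100 the event is still
# exponentially rare, the regime the Renyi-divergence bound is meant for.
n, k = 100, 60
p, q, alpha = 0.2, 0.5, 2.0

lhs = log_tail(p, n, k)  # exact log P(A)
rhs = ((alpha / (alpha - 1)) * log_tail(q, n, k)
       - renyi_divergence(q, p, alpha, n))
print(f"log P(A) = {lhs:.2f}, lower bound = {rhs:.2f}, holds: {lhs >= rhs}")

With these parameters the exact log-probability sits comfortably above the bound (by a margin on the order of ten nats here); the gap reflects the slack in Hölder's inequality and can be reduced by optimizing over α and the choice of reference measure.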

[1] W. Bryc and A. Dembo, Large Deviations for Quadratic Functionals of Gaussian Processes, 1993.

[2] R. G. Gallager, Information Theory and Reliable Communication. Wiley, 1968.

[3] T. van Erven and P. Harremoës, Rényi Divergence and Kullback-Leibler Divergence, IEEE Trans. Inf. Theory, 2014.

[4] R. M. Gray, Toeplitz and Circulant Matrices: A Review, Found. Trends Commun. Inf. Theory, 2005.

[5] E. Arikan and N. Merhav, Guessing Subject to Distortion, IEEE Trans. Inf. Theory, 1998.

[6] I. Karatzas and S. E. Shreve, Brownian Motion and Stochastic Calculus. Springer, 1987.

[7] P. Dupuis and R. S. Ellis, A Weak Convergence Approach to the Theory of Large Deviations. Wiley, 1997.

[8] A. J. Viterbi and J. K. Omura, Principles of Digital Communication and Coding. McGraw-Hill, 1979.

[9] K. Marton, Error exponent for source coding with a fidelity criterion, IEEE Trans. Inf. Theory, 1974.

[10] R. Atar, K. Chowdhary, and P. Dupuis, Robust Bounds on Risk-Sensitive Functionals via Rényi Divergence, SIAM/ASA J. Uncertain. Quantif., 2015.

[11] F. Liese and I. Vajda, Convex Statistical Distances. Teubner, 1987.

[12] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. II, Inf. Control, 1967.

[13] I. Vajda, Distances and discrimination rates for stochastic processes, Stochastic Process. Appl., 1990.

[14] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd ed. Cambridge University Press, 2011.

[15] N. Merhav, On Zero-Rate Error Exponents of Finite-State Channels With Input-Dependent States, IEEE Trans. Inf. Theory, 2014.

[16] Y. Polyanskiy, H. V. Poor, and S. Verdú, Channel Coding Rate in the Finite Blocklength Regime, IEEE Trans. Inf. Theory, 2010.

[17] T. van Erven and P. Harremoës, Rényi divergence and majorization, in Proc. IEEE Int. Symp. Information Theory (ISIT), 2010.

[18] L. Golshani, E. Pasha, and G. Yari, Some properties of Rényi entropy and Rényi entropy rate, Inf. Sci., 2009.

[19] A. Dembo and O. Zeitouni, Large Deviations Techniques and Applications, 2nd ed. Springer, 1998.