Information-theoretic applications of the logarithmic probability comparison bound

A well-known technique for estimating the probabilities of rare events, in general and in information theory in particular (used, for example, in the sphere-packing bound), is to find a reference probability measure under which the event of interest has probability of order one, and then to estimate the probability in question by means of the Kullback-Leibler divergence between the two measures. A method recently proposed in [14] can be viewed as an extension of this idea, in which the probability under the reference measure may itself decay exponentially and the Rényi divergence is used instead. The purpose of this paper is to demonstrate the usefulness of this approach in various information-theoretic settings. For the problem of channel coding, we provide a general methodology for obtaining matched, mismatched, and robust error-exponent bounds, as well as new results for a variety of particular channel models. Other applications we address include rate-distortion coding and the problem of guessing.
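For concreteness, the following is a minimal sketch of one form such a comparison bound can take, derived from Hölder's inequality; the precise statement and parametrization of the logarithmic probability comparison bound used in [4] and [14] may differ. Let $P$ and $Q$ be probability measures on a common space with $Q \ll P$ (otherwise the Rényi divergence below is infinite and the bound is vacuous), let $A$ be an event with $Q(A) > 0$, and fix an order $\alpha > 1$ with conjugate exponent $\alpha' = \alpha/(\alpha-1)$. Hölder's inequality gives

$$Q(A) = E_P\!\left[\frac{dQ}{dP}\,\mathbf{1}_A\right] \;\le\; \left(E_P\!\left[\left(\frac{dQ}{dP}\right)^{\alpha}\right]\right)^{1/\alpha} P(A)^{1/\alpha'},$$

and since the Rényi divergence of order $\alpha$ is $D_\alpha(Q\|P) = \frac{1}{\alpha-1}\log E_P\!\left[(dQ/dP)^{\alpha}\right]$, taking logarithms and multiplying through by $\alpha/(\alpha-1)$ yields the logarithmic comparison

$$-\log P(A) \;\le\; \frac{\alpha}{\alpha-1}\,\bigl[-\log Q(A)\bigr] \;+\; D_\alpha(Q\,\|\,P).$$

Heuristically, in the regime where $Q(A)$ is of order one the first term on the right is negligible and letting $\alpha \downarrow 1$ recovers the classical Kullback-Leibler change-of-measure estimate $-\log P(A) \lesssim D(Q\|P)$; when $Q(A)$ itself decays exponentially, the choice of $\alpha$ trades the amplification factor $\alpha/(\alpha-1)$ on the reference exponent against the size of the Rényi divergence term.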

[1] A. Dembo and O. Zeitouni, Large Deviations Techniques and Applications, Springer, 1998.

[2] T. van Erven and P. Harremoës, "Rényi divergence and majorization," Proc. 2010 IEEE Int. Symp. Information Theory (ISIT), 2010.

[3] A. J. Viterbi and J. K. Omura, Principles of Digital Communication and Coding, McGraw-Hill, 1979.

[4] R. Atar and N. Merhav, "Information-theoretic applications of the logarithmic probability comparison bound," IEEE Trans. Inf. Theory, 2015.

[5] W. Bryc and A. Dembo, "Large deviations for quadratic functionals of Gaussian processes," 1993.

[6] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, "Lower bounds to error probability for coding on discrete memoryless channels. II," Information and Control, 1967.

[7] E. Todorov et al., "A unified theory of linearly solvable optimal control," 2012.

[8] I. Vajda, "Distances and discrimination rates for stochastic processes," 1990.

[9] I. Karatzas and S. E. Shreve, Brownian Motion and Stochastic Calculus, 2nd ed., Springer, 1991.

[10] E. Arikan and N. Merhav, "Guessing subject to distortion," IEEE Trans. Inf. Theory, 1998.

[11] T. van Erven and P. Harremoës, "Rényi divergence and Kullback-Leibler divergence," IEEE Trans. Inf. Theory, 2014.

[12] R. M. Gray, "Toeplitz and circulant matrices: A review," Foundations and Trends in Communications and Information Theory, 2005.

[13] K. Marton, "Error exponent for source coding with a fidelity criterion," IEEE Trans. Inf. Theory, 1974.

[14] R. Atar, K. Chowdhary, and P. Dupuis, "Robust bounds on risk-sensitive functionals via Rényi divergence," SIAM/ASA J. Uncertainty Quantification, 2015.

[15] Y. Polyanskiy, H. V. Poor, and S. Verdú, "Channel coding rate in the finite blocklength regime," IEEE Trans. Inf. Theory, 2010.

[16] R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.

[17] N. Merhav, "On zero-rate error exponents of finite-state channels with input-dependent states," IEEE Trans. Inf. Theory, 2014.

[18] I. Csiszár and J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, 2nd ed., Cambridge University Press, 2011.

[19] P. Dupuis and R. S. Ellis, A Weak Convergence Approach to the Theory of Large Deviations, Wiley, 1997.

[20] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, "Lower bounds to error probability for coding on discrete memoryless channels. I," Information and Control, 1967.

[21] L. Golshani, E. Pasha, and G. Yari, "Some properties of Rényi entropy and Rényi entropy rate," Information Sciences, 2009.