An application of rate-distortion theory to a converse to the coding theorem

A lower bound to the information rate R(D) for a discrete memoryless source with a fidelity criterion is presented for the case in which the distortion matrix contains the same set of entries, perhaps permuted, in each column. A necessary and sufficient condition for R(D) to equal this bound is given. In particular, if the smallest column element is zero and occurs once in each row, then there is a range of D, 0 \leq D \leq D_{1}, in which equality holds. These results are then applied to the special case of d_{ij} = 1 - \delta_{ij}, for which the average distortion is just the probability of incorrectly reproducing the source output. We show how to construct R(D) for this case, from which one can solve for the minimum achievable probability of error when transmitting over a channel of known capacity.
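As an illustration of the Hamming-distortion special case, the sketch below works the standard textbook instance rather than the paper's general construction: a Bernoulli(p) source with p \leq 1/2 and per-letter distortion d_{ij} = 1 - \delta_{ij}, for which R(D) = H(p) - H(D) on 0 \leq D \leq p and zero beyond. Solving R(D) = C then gives the minimum achievable error probability over a channel of capacity C bits per source letter. The function names and the use of scipy are illustrative choices, not part of the original paper.

```python
import numpy as np
from scipy.optimize import brentq


def hb(x):
    """Binary entropy in bits; hb(0) = hb(1) = 0."""
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)


def rate_distortion_binary(p, D):
    """R(D) for a Bernoulli(p) source (p <= 1/2) under Hamming distortion:
    R(D) = H(p) - H(D) for 0 <= D <= p, and 0 for D >= p."""
    if D >= p:
        return 0.0
    return hb(p) - hb(D)


def min_error_probability(p, C):
    """Smallest average error probability D consistent with R(D) <= C,
    i.e. the root of H(p) - H(D) = C on (0, p]."""
    if C >= hb(p):  # capacity exceeds the source entropy: lossless transmission possible
        return 0.0
    f = lambda D: rate_distortion_binary(p, D) - C
    return brentq(f, 1e-12, p)  # sign change is guaranteed on this interval


if __name__ == "__main__":
    p, C = 0.5, 0.3  # uniform binary source, channel of capacity 0.3 bit per use
    print(min_error_probability(p, C))  # minimum achievable probability of error, about 0.189
```

For the uniform binary source this reproduces the familiar converse statement: no code can achieve an error probability below the D at which H(p) - H(D) falls to the channel capacity.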
