A lower bound to the rate-distortion function R(D) of finite-alphabet sources with memory is derived for the class of balanced distortion measures. For finite-state finite-alphabet Markov sources, sufficient conditions are given for the existence of a strictly positive average distortion D_c such that R(D) equals its lower bound for 0 \leq D \leq D_c. The bound is evaluated for the Hamming and Lee distortion measures and is identical to the corresponding bound for memoryless sources having the same entropy and alphabet. These results are applied to yield a simple proof of the converse of the noisy-channel coding theorem for sources satisfying the sufficient conditions for equality with the lower bound and channels with memory. D_c is evaluated explicitly for the special case of the binary asymmetric Markov source.
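As a rough numerical illustration (not taken from the paper), the lower bound described above for the Hamming distortion measure has the form H_\infty - h(D), where H_\infty is the entropy rate of the source and h(\cdot) is the binary entropy function. The sketch below, under these assumptions, evaluates this bound for a binary asymmetric Markov source with hypothetical transition probabilities a = P(1|0) and b = P(0|1):

```python
import math

def h2(p):
    """Binary entropy function in bits; h2(0) = h2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def markov_entropy_rate(a, b):
    """Entropy rate (bits/symbol) of a binary Markov source with
    transition probabilities a = P(1|0) and b = P(0|1).
    The stationary distribution is pi0 = b/(a+b), pi1 = a/(a+b)."""
    pi0 = b / (a + b)
    pi1 = a / (a + b)
    return pi0 * h2(a) + pi1 * h2(b)

def lower_bound(a, b, D):
    """Shannon-type lower bound H_inf - h2(D) to R(D) under the
    Hamming distortion measure, clipped at zero (a sketch, not
    the paper's exact derivation)."""
    return max(markov_entropy_rate(a, b) - h2(D), 0.0)
```

For the symmetric case a = b = 0.5 the source is memoryless and equiprobable, so the bound at D = 0 recovers the full entropy of 1 bit/symbol; for strongly persistent sources (small a, b) the entropy rate, and hence the bound, is much smaller.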