This paper is concerned with the transmission of a discrete, independent-letter information source over a discrete channel. A distortion function is defined between source output letters and decoder output letters and is used to measure the performance of the system for each transmission. The coding block length is introduced as a variable, and its influence upon the minimum attainable transmission distortion is investigated. The lower bound to transmission distortion is found to converge to the distortion level d_C (C is the channel capacity) algebraically as a/n. The nonnegative coefficient a is a function of both the source and channel statistics, which are interrelated in such a way as to suggest the utility of this coefficient as a measure of “mismatch” between source and channel: the larger the mismatch, the slower the approach of the lower bound to the asymptote d_C. For noiseless channels a = ∞, and in this case the lower bound is shown to converge to d_C as a_1(ln n)/n. For noisy channels the upper bound to transmission distortion is found to converge to the asymptote d_C algebraically as b[(ln n)/n]^{1/2}. For noiseless channels, the upper bound converges to d_C as a_1(ln n)/n.
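For readability, the convergence rates quoted above can be collected in display form. The block below is only a restatement of the abstract's leading-order claims; the symbols D_L(n) and D_U(n), denoting the lower and upper bounds to transmission distortion at block length n, are notation introduced here for convenience and are not taken from the paper itself.

\begin{align*}
  D_L(n) - d_C &\sim \frac{a}{n}                            && \text{(lower bound, noisy channel)} \\
  D_L(n) - d_C &\sim a_1 \,\frac{\ln n}{n}                  && \text{(lower bound, noiseless channel)} \\
  D_U(n) - d_C &\sim b \left( \frac{\ln n}{n} \right)^{1/2} && \text{(upper bound, noisy channel)} \\
  D_U(n) - d_C &\sim a_1 \,\frac{\ln n}{n}                  && \text{(upper bound, noiseless channel)}
\end{align*}

As stated in the abstract, the noiseless-channel upper and lower bounds approach d_C at the same (ln n)/n order, whereas for noisy channels the quoted lower-bound rate a/n and upper-bound rate b[(ln n)/n]^{1/2} differ.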