The relation between the entropy of a discrete random variable and the minimum attainable probability of error in guessing its value is examined. While Fano's inequality provides a tight lower bound on the error probability in terms of the entropy, the present authors derive a converse result: a tight upper bound on the minimal error probability in terms of the entropy. Both bounds are sharp, and together they relate the error probability of the maximum a posteriori (MAP) rule to the conditional entropy (equivocation), a useful uncertainty measure in several applications. Combining this relation with the classical channel coding theorem, the authors present a channel coding theorem for the equivocation which, unlike the channel coding theorem for error probability, is meaningful at all rates. This theorem is proved directly for discrete memoryless channels (DMCs), and from the proof it is further concluded that for R ≥ C the equivocation achieves its minimal value of R − C at the rate of n^{1/2}, where n is the block length.
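In symbols, and with standard notation that the abstract itself does not fix (X is the variable to be guessed from an observation Y, P_e the error probability of the MAP rule, H(X|Y) the equivocation in bits, h(·) the binary entropy function), the two directions of the relation can be sketched as

\[ H(X \mid Y) \;\le\; h(P_e) + P_e \log_2\bigl(|\mathcal{X}| - 1\bigr) \qquad \text{(Fano: lower bound on } P_e\text{)} \]
\[ P_e \;\le\; \tfrac{1}{2}\, H(X \mid Y) \qquad \text{(converse direction, in the weaker closed form of Hellman and Raviv)} \]

where the paper's tight upper bound is the piecewise-linear refinement of the second inequality, usually quoted as the curve through the points (H(X|Y), P_e) = (log_2 k, (k−1)/k) for k = 1, 2, .... On the same reading of the abstract, the equivocation coding theorem for a DMC of capacity C, message W, coding rate R, and channel output Y^n can be sketched as

\[ \lim_{n \to \infty} \frac{1}{n} \min H(W \mid Y^n) \;=\; \max\{R - C,\, 0\}, \]

with the minimum taken over codes of block length n; these are hedged restatements consistent with the abstract, not formulas taken verbatim from the paper.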