List decoding for noisy channels
Shannon's fundamental coding theorem for noisy channels states that such a channel has a capacity C, and that for any transmission rate R less than C the receiver can use a received sequence of n symbols to select one of the 2^{nR} possible transmitted sequences, with an error probability Pe that can be made arbitrarily small by increasing n while keeping R and C fixed. Recently, upper and lower bounds have been found for the best obtainable Pe as a function of C, R, and n. This paper investigates this relationship for a modified decoding procedure, in which the receiver lists L messages, rather than one, after reception. In this case, for given C and R, it is possible to choose L large enough that the ratio of the upper and lower bounds on the error probability is arbitrarily close to 1 for all large n. This implies that for large L, the average over all codes is almost as good as the best code, and in fact that almost all codes are almost as good as the best code.
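The decoding rule the abstract describes can be made concrete with a small sketch. The snippet below is an illustrative minimum-distance list decoder, not the paper's analysis: it assumes a binary symmetric channel and a toy random codebook, and it simply returns the L codewords closest in Hamming distance to the received word. Decoding succeeds in the list sense if the transmitted codeword appears anywhere on the list. All names (`hamming`, `list_decode`, the codebook construction) are hypothetical choices for this example.

```python
import random

def hamming(a, b):
    """Hamming distance between two equal-length bit tuples."""
    return sum(x != y for x, y in zip(a, b))

def list_decode(received, codebook, L):
    """Return the L codewords nearest to the received word.

    This is minimum-distance list decoding: instead of committing to the
    single closest codeword, the receiver outputs a list of L candidates.
    """
    return sorted(codebook, key=lambda c: hamming(c, received))[:L]

# Toy demo: block length n = 8, 2^{nR} = 4 messages (so R = 0.25),
# binary symmetric channel with crossover probability p = 0.1.
random.seed(0)
n, M, L, p = 8, 4, 2, 0.1
codebook = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(M)]
sent = codebook[0]
received = tuple(b ^ (random.random() < p) for b in sent)
decoded = list_decode(received, codebook, L)
# A list-decoding error occurs only if the sent word is NOT on the list;
# enlarging L can only shrink this error event.
print(sent in decoded)
```

The point the abstract makes is about this error event: as L grows, the gap between the best code and a typical random code (in terms of the probability that the sent word misses the list) vanishes.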