List decoding for noisy channels

Shannon's fundamental coding theorem for noisy channels states that such a channel has a capacity C, and that for any transmission rate R less than C it is possible for the receiver to use a received sequence of n symbols to select one of the 2^{nR} possible transmitted sequences, with an error probability P_e which can be made arbitrarily small by increasing n, keeping R and C fixed. Recently, upper and lower bounds have been found for the best obtainable P_e as a function of C, R, and n. This paper investigates this relationship for a modified decoding procedure, in which the receiver lists L messages, rather than one, after reception. In this case, for given C and R, it is possible to choose L large enough that the ratio of the upper and lower bounds to the error probability is arbitrarily near 1 for all large n. This implies that for large L, the average error probability over all codes is almost as good as that of the best code, and in fact that almost all codes are almost as good as the best code.
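The decoding rule described above can be illustrated concretely. On a binary symmetric channel, maximum-likelihood list decoding amounts to returning the L codewords nearest the received word in Hamming distance, and a list-decoding error occurs only when the transmitted word is absent from that list. The following is a minimal sketch under assumed toy parameters (n = 7, a hypothetical four-word codebook, L = 2); it is an illustration, not the paper's construction.

```python
def hamming_distance(a, b):
    # Number of positions where two binary tuples differ.
    return sum(x != y for x, y in zip(a, b))

def list_decode(received, codebook, L):
    # Maximum-likelihood list decoding on a binary symmetric channel:
    # return the L codewords closest to the received word.
    return sorted(codebook, key=lambda c: hamming_distance(received, c))[:L]

# Hypothetical toy codebook: n = 7, M = 4 codewords, so 2^{nR} = 4.
codebook = [
    (0, 0, 0, 0, 0, 0, 0),
    (1, 1, 1, 1, 1, 1, 1),
    (1, 0, 1, 0, 1, 0, 1),
    (0, 1, 0, 1, 0, 1, 0),
]

sent = codebook[0]
# Simulate channel noise by flipping one bit (position 3).
received = tuple(b ^ (1 if i == 3 else 0) for i, b in enumerate(sent))

L = 2
candidates = list_decode(received, codebook, L)
# A list-decoding error occurs only if the transmitted word
# is not among the L listed candidates.
error = sent not in candidates
```

With this codebook the transmitted all-zeros word is at distance 1 from the received word, closer than any other codeword, so it heads the list and no list-decoding error occurs. Averaging this error event over random codebooks is the quantity whose upper and lower bounds the abstract compares.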