Good codes can be produced by a few permutations

Our main result is that good codes, even those meeting the random coding bound, can be produced with relatively few (linear in the block length) permutations from a single codeword. This reduction in complexity may be of practical importance. The motivation for looking at such codes came from Ahlswede's covering lemma, which makes it possible to build correlated source codes from channel codes via permutations. In Appendix I we show that the problem of finding the best error exponents for coding sources with full side information at the decoder, which has received attention in the recent literature, can easily be reduced to the familiar one for the discrete memoryless channel (DMC). Finally, in Appendices II and III we give rather precise double exponentially small bounds on the probabilities that a randomly chosen code will fail to meet the random coding or expurgated bound for the DMC. According to these results, good codes are hard to miss if selected at random. This also explains why good codes of low complexity (such as those produced by a few permutations) exist.
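As a concrete illustration of the first claim, the sketch below builds a codebook by applying coordinate permutations to one base codeword. It is a minimal sketch only: the name `permuted_codebook` and its parameters are ours, the permutations are drawn uniformly at random, and nothing here reproduces the paper's argument that linearly many carefully chosen permutations suffice to meet the random coding bound.

```python
import random

def permuted_codebook(base, num_codewords, rng=random):
    """Codewords obtained as coordinate permutations of `base` (illustrative only)."""
    n = len(base)
    book = []
    for _ in range(num_codewords):
        perm = list(range(n))
        rng.shuffle(perm)  # a uniformly random permutation of the n coordinates
        book.append(tuple(base[perm[i]] for i in range(n)))
    return book

# Every permuted codeword has the same composition (type) as the base word,
# e.g. a binary word with equally many zeros and ones:
base = [0] * 8 + [1] * 8
code = permuted_codebook(base, num_codewords=16)
```

A useful side effect of permuting a single word is that all codewords share the type (composition) of the base word, which is what makes composition bounds directly applicable to such codes.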

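To make "double exponentially small" concrete, the bounds of Appendices II and III have, schematically, the shape

\[
\Pr\Bigl\{\text{a random code of rate } R \text{ has } P_e > e^{-n\,(E_r(R)-\delta)}\Bigr\} \;\le\; \exp\!\bigl(-e^{c n}\bigr)
\]

for some constants \(\delta, c > 0\) and all sufficiently large block lengths \(n\), where \(E_r(R)\) is the random coding exponent. The constants here are placeholders rather than the paper's precise values; the point is only that doubly exponential decay in \(n\) is what makes good codes "hard to miss" under random selection.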