Pseudo-random recursive convolutional coding for near-capacity performance

We consider recursive convolutional coding as a means of constructing codes whose distance distribution is close to the average distribution obtained by random coding, and whose performance is therefore expected to approach channel capacity closely. We focus in particular on convolutional codes whose encoder register taps are chosen so that the encoder generates maximum-length sequences. Two algorithms for decoding these codes are discussed. Since both entail implementation difficulties, we propose generating such codes by means similar to turbo codes, which makes their decoding easy.
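The abstract does not specify the encoder construction, but the two ingredients it names can be sketched concretely. Below is an illustrative Python sketch (an assumption, not the paper's implementation): first, an autonomous linear feedback shift register whose primitive feedback polynomial yields a maximum-length sequence; second, a rate-1/2 recursive systematic convolutional encoder built around such a primitive feedback polynomial, here the (13, 15) octal generator pair familiar from turbo-code literature. All function names and parameter choices are hypothetical.

```python
def m_sequence(taps_mask, m, state=1):
    """Autonomous (zero-input) run of a Fibonacci LFSR.

    When the feedback polynomial is primitive, the register cycles
    through all 2**m - 1 nonzero states before repeating, so the
    output is a maximum-length (m-)sequence.
    Returns (output_bits, period).
    """
    start, out = state, []
    while True:
        out.append(state & 1)                         # output the low bit
        fb = bin(state & taps_mask).count("1") & 1    # parity of tapped bits
        state = (state >> 1) | (fb << (m - 1))        # shift, insert feedback
        if state == start:
            return out, len(out)


def rsc_encode(bits, fb_taps=(1, 3), ff_taps=(2, 3), m=3):
    """Rate-1/2 recursive systematic convolutional encoder,
    controller canonical form.

    Default generators (an illustrative choice): feedback
    1 + D + D**3, which is primitive, so the autonomous register
    produces an m-sequence; feedforward 1 + D**2 + D**3.
    Returns a list of (systematic, parity) output bit pairs.
    """
    d = [0] * m                        # shift register, d[0] most recent
    out = []
    for u in bits:
        a = u
        for t in fb_taps:              # recursive (feedback) part
            a ^= d[t - 1]
        v = a
        for t in ff_taps:              # feedforward (parity) part
            v ^= d[t - 1]
        out.append((u, v))             # systematic bit plus parity bit
        d = [a] + d[:-1]               # shift the new value in
    return out
```

For example, `m_sequence(0b011, 3)` uses the primitive polynomial 1 + D + D**3 and returns a sequence of period 2**3 - 1 = 7 with the balance property of m-sequences (four ones, three zeros per period). Because the encoder is recursive, a single input 1 produces a parity stream that never terminates on its own, which is the behavior turbo codes rely on.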