Joint Source-Channel Coding and Guessing with Application to Sequential Decoding

We extend our earlier work on guessing subject to distortion to the joint source-channel coding context. We consider a system in which a source is connected to a destination via a channel, and the goal is to reconstruct the source output at the destination within a prescribed distortion level with respect to (w.r.t.) some distortion measure. The decoder is a guessing decoder in the sense that it is allowed to generate successive estimates of the source output until the distortion criterion is met. The problem is to design the encoder and the decoder so as to minimize the average number of estimates until successful reconstruction. We derive estimates on the nonnegative moments of the number of guesses, which are asymptotically tight as the length of the source block goes to infinity. Using the close relationship between guessing and sequential decoding, we give a tight lower bound on the complexity of sequential decoding in joint source-channel coding systems, complementing the earlier works of Koshelev (1973) and Hellman (1975). Another topic explored here is the probability of error for list decoders with exponential list sizes in joint source-channel coding systems, for which we also obtain tight bounds. It is noteworthy that, for the performance measures considered here, optimal performance can be achieved by schemes that separate source coding and channel coding.
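
As a rough illustration of the guessing decoder described above, the following Python sketch simulates a toy system: an i.i.d. binary source block is transmitted (uncoded, for simplicity) over a binary symmetric channel, and the decoder emits candidate reconstructions in decreasing order of posterior probability until the Hamming distortion criterion is met. All specifics here (the binary source, the BSC, the identity encoder, Hamming distortion, and the parameter values) are illustrative assumptions rather than the system analyzed in the paper; the sketch only shows how the number of guesses G and its empirical ρ-th moment E[G^ρ] can be measured.

```python
# Toy Monte Carlo sketch of a guessing decoder (hypothetical model, not the
# construction analyzed in the paper): a binary source block is sent over a
# binary symmetric channel, and the decoder produces successive estimates in
# decreasing posterior probability until the Hamming distortion criterion is
# met. The empirical rho-th moment of the number of guesses is reported.

import itertools
import random

N = 8            # source block length (tiny, so exhaustive enumeration is feasible)
P_ONE = 0.3      # source is i.i.d. Bernoulli(P_ONE)
CROSSOVER = 0.1  # BSC crossover probability
D = 1            # allowed Hamming distortion (number of mismatched positions)
RHO = 1.0        # moment order
TRIALS = 2000

def source_block():
    return tuple(1 if random.random() < P_ONE else 0 for _ in range(N))

def channel(x):
    # Uncoded transmission for illustration: each source bit passes through the BSC.
    return tuple(b ^ (1 if random.random() < CROSSOVER else 0) for b in x)

def posterior(x, y):
    # P(x) * P(y | x), proportional to the posterior P(x | y).
    p = 1.0
    for xb, yb in zip(x, y):
        p *= P_ONE if xb else 1.0 - P_ONE
        p *= CROSSOVER if xb != yb else 1.0 - CROSSOVER
    return p

def hamming(x, xhat):
    return sum(a != b for a, b in zip(x, xhat))

def guesses_until_success(x, y):
    # Guessing decoder: rank all candidate blocks by posterior probability and
    # count the estimates produced before one meets the distortion criterion.
    candidates = sorted(itertools.product((0, 1), repeat=N),
                        key=lambda c: posterior(c, y), reverse=True)
    for g, xhat in enumerate(candidates, start=1):
        if hamming(x, xhat) <= D:
            return g
    return len(candidates)  # unreachable: the true block always has distortion 0 <= D

moment = sum(guesses_until_success(x, channel(x)) ** RHO
             for x in (source_block() for _ in range(TRIALS))) / TRIALS
print(f"empirical E[G^{RHO}] over {TRIALS} trials: {moment:.2f}")
```

The quantity estimated by this simulation, E[G^ρ], is the performance measure whose asymptotic exponential growth rate in the block length the bounds mentioned above characterize.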

[1] K. Marton, "Error exponent for source coding with a fidelity criterion," IEEE Trans. Inf. Theory, 1974.

[2] I. Csiszár, "On the error exponent of source-channel transmission with a distortion threshold," IEEE Trans. Inf. Theory, 1982.

[3] M. E. Hellman, "Convolutional source encoding," IEEE Trans. Inf. Theory, 1975.

[4] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2005.

[5] R. M. Fano, "A heuristic discussion of probabilistic decoding," IEEE Trans. Inf. Theory, 1963.

[6] S. Arimoto et al., "Computational moments for sequential decoding of convolutional codes," IEEE Trans. Inf. Theory, 1979.

[7] N. Merhav et al., "Guessing subject to distortion," IEEE Trans. Inf. Theory, 1998.

[8] E. R. Berlekamp et al., "A lower bound to the distribution of computation for sequential decoding," IEEE Trans. Inf. Theory, 1967.

[9] V. N. Koshelev, "Direct sequential encoding and decoding for discrete sources," IEEE Trans. Inf. Theory, 1973.

[10] R. G. Gallager, "A simple derivation of the coding theorem and some applications," IEEE Trans. Inf. Theory, 1965.

[11] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, "Lower bounds to error probability for coding on discrete memoryless channels. II," Inf. Control, 1967.

[12] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, "Lower bounds to error probability for coding on discrete memoryless channels. I," Inf. Control, 1967.

[13] F. Jelinek, "Fast sequential decoding algorithm using a stack," 1969.

[14] G. D. Forney, Jr., "Exponential error bounds for erasure, list, and decision feedback schemes," IEEE Trans. Inf. Theory, 1968.

[15] S. Lin, "On sequential decoding," 1967.

[16] I. Csiszár, Information Theory, 1981.

[17] E. Arikan, "An inequality on guessing and its application to sequential decoding," IEEE Trans. Inf. Theory, 1996.

[18] A. Rényi, "On measures of entropy and information," 1961.