Conveying individual source sequences over memoryless channels using finite-state decoders with source side information

We study the following semi-deterministic setting of the joint source-channel coding problem: a deterministic source sequence (a.k.a. an individual sequence) is transmitted over a memoryless channel and decoded by a delay-limited, finite-state decoder with access to side information, namely, a noisy version of the source sequence. We first prove a separation theorem for the expected distortion between the source sequence and its reconstruction, which extends earlier results of the same flavor (Ziv 1980, Merhav 2014) that did not allow decoder side information. We then derive a lower bound on the best achievable excess-distortion probability and discuss situations in which it is achievable; here, of course, source coding and channel coding cannot be completely separated if one wishes to meet the bound. Finally, we outline several variations and extensions of the model considered, such as: (i) incorporating a common-reconstruction constraint, (ii) availability of side information at both ends, and (iii) extension to the Gel'fand-Pinsker channel.
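To make the setting concrete, the following is a minimal toy simulation of the model the abstract describes: a deterministic (individual) binary source sequence is sent uncoded over a binary symmetric channel, and a delay-limited finite-state decoder reconstructs it with the help of side information that is an independently corrupted copy of the source. All specifics here (block length, crossover probabilities, the run-structured source, and the particular sliding-window decoding rule) are illustrative assumptions, not the paper's actual constructions.

```python
import random

random.seed(0)

n = 10_000     # block length (assumed, for illustration only)
delta = 0.10   # BSC crossover probability of the channel (assumed)
q = 0.20       # noise level of the decoder side information (assumed)

# A deterministic "individual" source sequence: alternating runs of
# 50 zeros and 50 ones.  No probabilistic source model is assumed.
x = [(i // 50) % 2 for i in range(n)]

# Uncoded transmission over a memoryless BSC(delta).
z = [xi ^ (random.random() < delta) for xi in x]

# Decoder side information: the source corrupted by an independent BSC(q).
y = [xi ^ (random.random() < q) for xi in x]

def hamming_distortion(a, b):
    """Normalized Hamming distance between two binary sequences."""
    return sum(ai != bi for ai, bi in zip(a, b)) / len(a)

def fsm_decode(z, y):
    """A delay-1 finite-state decoder (hypothetical rule): its state holds
    the two most recent channel symbols, so at time i it can take a
    majority vote over (z[i-1], z[i], z[i+1], y[i]) and use the side
    information y[i] to break 2-2 ties."""
    m = len(z)
    out = []
    for i in range(m):
        votes = [z[max(i - 1, 0)], z[i], z[min(i + 1, m - 1)], y[i]]
        ones = sum(votes)
        if ones > 2:
            out.append(1)
        elif ones < 2:
            out.append(0)
        else:
            out.append(y[i])  # 2-2 tie: trust the side information
    return out

d_raw = hamming_distortion(x, z)                 # decoder ignores y
d_fsm = hamming_distortion(x, fsm_decode(z, y))  # finite-state decoder
print(f"raw channel-output distortion:   {d_raw:.4f}")
print(f"finite-state decoder distortion: {d_fsm:.4f}")
```

On a run-structured source like this one, the windowed vote exploits both the source's regularity and the side information, so the finite-state decoder's empirical distortion falls well below the raw channel error rate — a small-scale illustration of why side information and limited decoder memory interact in this problem.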

[1] R. G. Gallager, Information Theory and Reliable Communication, New York: Wiley, 1968.

[2] B. S. Clarke and A. R. Barron, "Information-theoretic asymptotics of Bayes methods," IEEE Trans. Inf. Theory, 1990.

[3] I. Csiszár, "On the error exponent of source-channel transmission with a distortion threshold," IEEE Trans. Inf. Theory, 1982.

[4] N. Merhav, "Universal detection of messages via finite-state channels," IEEE Trans. Inf. Theory, 2000.

[5] N. Merhav and S. Shamai (Shitz), "On joint source-channel coding for the Wyner-Ziv source and the Gel'fand-Pinsker channel," IEEE Trans. Inf. Theory, 2003.

[6] N. Merhav and J. Ziv, "On the Wyner-Ziv problem for individual sequences," IEEE Trans. Inf. Theory, 2006.

[7] R. M. Gray, "A new class of lower bounds to information rates of stationary sources via conditional rate-distortion functions," IEEE Trans. Inf. Theory, 1973.

[8] A. Lempel and J. Ziv, "Compression of two-dimensional data," IEEE Trans. Inf. Theory, 1986.

[9] S. Shamai (Shitz), S. Verdú, and R. Zamir, "Systematic lossy source/channel coding," IEEE Trans. Inf. Theory, 1998.

[10] A. Lempel and J. Ziv, "On the complexity of finite sequences," IEEE Trans. Inf. Theory, 1976.

[11] T. M. Cover and J. A. Thomas, Elements of Information Theory, 2nd ed., Hoboken, NJ: Wiley, 2005.

[12] J. Ziv, "Universal decoding for finite-state channels," IEEE Trans. Inf. Theory, 1985.

[13] A. D. Wyner and J. Ziv, "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inf. Theory, 1976.

[14] Y. Steinberg, "Coding and common reconstruction," IEEE Trans. Inf. Theory, 2009.

[15] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, "Lower bounds to error probability for coding on discrete memoryless channels. I," Information and Control, 1967.

[16] J. Ziv and A. Lempel, "Compression of individual sequences via variable-rate coding," IEEE Trans. Inf. Theory, 1978.

[17] K. Marton, "Error exponent for source coding with a fidelity criterion," IEEE Trans. Inf. Theory, 1974.

[18] G. Keshet, Y. Steinberg, and N. Merhav, "Channel coding in the presence of side information," Found. Trends Commun. Inf. Theory, 2008.

[19] J. Ziv, "Coding theorems for individual sequences," IEEE Trans. Inf. Theory, 1978.

[20] R. Zamir, "The rate loss in the Wyner-Ziv problem," IEEE Trans. Inf. Theory, 1996.

[21] J. Ziv, "Distortion-rate theory for individual sequences," IEEE Trans. Inf. Theory, 1980.

[22] N. Merhav, "On the data processing theorem in the semi-deterministic setting," IEEE Trans. Inf. Theory, 2014.

[23] J. Ziv, "Fixed-rate encoding of individual sequences with side information," IEEE Trans. Inf. Theory, 1984.