Finite-state source-channel coding for individual sequences with source side information at the decoder

We study the following semi-deterministic setting of the joint source–channel coding problem: a deterministic source sequence (a.k.a. an individual sequence) is transmitted over a memoryless channel by a delay-limited encoder and decoder, both implementable by periodically varying finite-state machines, and the decoder is granted access to side information in the form of a noisy version of the source sequence. We first derive a lower bound on the achievable expected distortion in terms of the empirical statistics of the source sequence, the numbers of states of the encoder and the decoder, their period, and the overall delay. The bound is shown to be asymptotically achievable by universal block codes in the limit of long blocks. We also derive a lower bound on the best achievable excess-distortion probability and discuss situations in which it is achievable. Here, of course, source coding and channel coding cannot be completely separated without loss of optimality. Finally, we outline a few extensions of the model considered, such as: (i) incorporating a common-reconstruction constraint, (ii) availability of side information at both ends, and (iii) extension to the Shannon channel with causal state information at the encoder. This work both extends and improves upon earlier work of the same flavor (Ziv 1980; Merhav 2014), which focused only on the expected distortion, without side information at either end, and without the additional ingredients mentioned above.
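To make the encoder model concrete, the following is a minimal sketch of a periodically varying finite-state encoder: at time t, the channel input is produced from the current source symbol and the encoder state by functions indexed by t mod p (the period), and the state evolves the same way. The class name, the toy binary maps, and the specific update rules are illustrative assumptions, not the paper's construction.

```python
class PeriodicFSMEncoder:
    """Sketch of a periodically varying finite-state encoder (illustrative only).

    At time t with phase k = t mod p, the channel input is
    x_t = output_fns[k](u_t, z_t) and the state evolves as
    z_{t+1} = next_state_fns[k](u_t, z_t).
    """

    def __init__(self, output_fns, next_state_fns, initial_state=0):
        # output_fns[k] / next_state_fns[k] are applied at times t with t % p == k
        assert len(output_fns) == len(next_state_fns)
        self.output_fns = output_fns
        self.next_state_fns = next_state_fns
        self.state = initial_state
        self.t = 0

    def step(self, u):
        k = self.t % len(self.output_fns)
        x = self.output_fns[k](u, self.state)                # channel input x_t
        self.state = self.next_state_fns[k](u, self.state)   # next state z_{t+1}
        self.t += 1
        return x


# Toy instance (purely illustrative): period 2, binary state, XOR-style maps.
enc = PeriodicFSMEncoder(
    output_fns=[lambda u, z: (u + z) % 2, lambda u, z: u],
    next_state_fns=[lambda u, z: u, lambda u, z: (u + z) % 2],
)
xs = [enc.step(u) for u in [1, 0, 1, 1]]  # channel input sequence
```

A decoder of the same kind would additionally take the side-information symbol as an argument at each step; the point of the sketch is only that both the output and the state transition depend on the time index through its residue modulo the period.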

[1] Katalin Marton, et al., "Error exponent for source coding with a fidelity criterion," IEEE Trans. Inf. Theory, 1974.

[2] Amos Lapidoth, et al., "Constrained source-coding with side information," IEEE Trans. Inf. Theory, 2014.

[3] Jacob Ziv, "Fixed-rate encoding of individual sequences with side information," IEEE Trans. Inf. Theory, 1984.

[4] Andrew R. Barron, et al., "Information-theoretic asymptotics of Bayes methods," IEEE Trans. Inf. Theory, 1990.

[5] Imre Csiszár, "On the error exponent of source-channel transmission with a distortion threshold," IEEE Trans. Inf. Theory, 1982.

[6] Neri Merhav, et al., "Channel coding in the presence of side information," Found. Trends Commun. Inf. Theory, 2008.

[7] Jacob Ziv, et al., "Coding theorems for individual sequences," IEEE Trans. Inf. Theory, 1978.

[8] Abraham Lempel, et al., "Compression of individual sequences via variable-rate coding," IEEE Trans. Inf. Theory, 1978.

[9] Neri Merhav, et al., "On universal simulation of information sources using training data," IEEE Trans. Inf. Theory, 2004.

[10] Neri Merhav, "Universal detection of messages via finite-state channels," IEEE Trans. Inf. Theory, 2000.

[11] R. Gray, et al., "A new class of lower bounds to information rates of stationary sources via conditional rate-distortion functions," IEEE Trans. Inf. Theory, 1973.

[12] Abraham Lempel, et al., "Compression of two-dimensional data," IEEE Trans. Inf. Theory, 1986.

[13] Neri Merhav, et al., "On the Wyner-Ziv problem for individual sequences," IEEE Trans. Inf. Theory, 2006.

[14] Ram Zamir, et al., "The rate loss in the Wyner-Ziv problem," IEEE Trans. Inf. Theory, 1996.

[15] Yossef Steinberg, et al., "Coding and common reconstruction," IEEE Trans. Inf. Theory, 2009.

[16] Shlomo Shamai, et al., "On joint source-channel coding for the Wyner-Ziv source and the Gel'fand-Pinsker channel," IEEE Trans. Inf. Theory, 2003.

[17] Thomas M. Cover, et al., Elements of Information Theory, 2005.

[18] Jacob Ziv, et al., "Universal decoding for finite-state channels," IEEE Trans. Inf. Theory, 1985.

[19] Abraham Lempel, et al., "On the complexity of finite sequences," IEEE Trans. Inf. Theory, 1976.

[20] Claude E. Shannon, "Channels with side information at the transmitter," IBM J. Res. Dev., 1958.

[21] Shlomo Shamai, et al., "Systematic lossy source/channel coding," IEEE Trans. Inf. Theory, 1998.

[22] Kevin Atteson, "The asymptotic redundancy of Bayes rules for Markov chains," IEEE Trans. Inf. Theory, 1999.

[23] Jacob Ziv, "Distortion-rate theory for individual sequences," IEEE Trans. Inf. Theory, 1980.

[24] Neri Merhav, "On the data processing theorem in the semi-deterministic setting," IEEE Trans. Inf. Theory, 2014.

[25] Robert G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.

[26] Aaron D. Wyner, et al., "The rate-distortion function for source coding with side information at the decoder," IEEE Trans. Inf. Theory, 1976.

[27] C. E. Shannon, R. G. Gallager, and E. R. Berlekamp, "Lower bounds to error probability for coding on discrete memoryless channels. I," Information and Control, 1967.