Trade-offs Between Error Exponents and Excess-Rate Exponents of Typical Slepian-Wolf Codes

Typical random codes (TRCs) in the scenario of source coding with side information at the decoder are the main subject of this work. We derive the TRC error exponent for fixed-rate random binning and show that at relatively high rates, the TRC deviates significantly from the optimal code. We discuss the trade-offs between the error exponent and the excess-rate exponent for the typical random variable-rate code and characterize its optimal rate function. We show that the error exponent of the typical random variable-rate code may be strictly higher than in fixed-rate coding. We propose a new ensemble, the semi-deterministic ensemble, a certain variant of the variable-rate code, and show that it dramatically improves upon the latter: it is proved that the trade-off function between the error exponent and the excess-rate exponent for the typical random semi-deterministic code may be strictly higher than the same trade-off for the variable-rate code. Moreover, we show that the performance under optimal decoding can also be attained by two universal decoders: the minimum empirical entropy decoder and the generalized (stochastic) likelihood decoder with an empirical entropy metric.
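To make the coding scenario concrete, the following is a minimal toy sketch (not the paper's construction) of fixed-rate Slepian-Wolf random binning over a binary doubly symmetric source, decoded with a minimum empirical conditional entropy decoder. All parameter values (block length `n`, rate `0.7`, crossover probability `p`) are illustrative assumptions chosen so that exhaustive enumeration is feasible.

```python
import itertools
import math
import random

def empirical_cond_entropy(x, y):
    """Empirical conditional entropy H_hat(X|Y), computed from the joint type of (x, y)."""
    n = len(x)
    joint, marg_y = {}, {}
    for a, b in zip(x, y):
        joint[(a, b)] = joint.get((a, b), 0) + 1
        marg_y[b] = marg_y.get(b, 0) + 1
    # H(X|Y) = -sum_{a,b} P(a,b) log2 P(a|b), with empirical probabilities
    return -sum((c / n) * math.log2(c / marg_y[b]) for (a, b), c in joint.items())

def slepian_wolf_demo(n=10, rate=0.7, p=0.1, seed=0):
    """One round of fixed-rate random binning + universal decoding; returns decoding success."""
    rng = random.Random(seed)
    num_bins = 2 ** round(rate * n)
    # Fixed-rate random binning: every source sequence draws an i.i.d. uniform bin index.
    bins = {x: rng.randrange(num_bins)
            for x in itertools.product((0, 1), repeat=n)}
    # Correlated pair: X uniform, Y is X observed through a BSC(p) (side information).
    x = tuple(rng.randrange(2) for _ in range(n))
    y = tuple(b ^ (1 if rng.random() < p else 0) for b in x)
    # Decoder sees only the bin index of x and the side information y:
    # among all sequences in that bin, pick the one minimizing H_hat(X|Y).
    candidates = [z for z, b in bins.items() if b == bins[x]]
    x_hat = min(candidates, key=lambda z: empirical_cond_entropy(z, y))
    return x_hat == x
```

The decoder is universal in the sense that it uses only the empirical joint type of the candidate and the side information, with no knowledge of the correlation parameter `p`.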
