On the Reliability Function of the Discrete Memoryless Relay Channel

Bounds on the reliability function of the discrete memoryless relay channel are derived using the method of types. Two achievable error exponents are obtained, based on partial decode-forward and compress-forward, two well-known superposition block-Markov coding schemes. The derivations combine the techniques used to prove the Csiszár-Körner-Marton packing lemma, which gives the error exponent for channel coding, and Marton's type covering lemma, which gives the error exponent for source coding with a fidelity criterion. The decode-forward error exponent is evaluated on Sato's relay channel. This example shows that, to obtain the fastest possible decay of the error probability at a fixed effective coding rate, one ought to optimize the number of blocks in the block-Markov coding scheme, assuming the blocklength within each block is large. An upper bound on the reliability function is also derived using ideas from Haroutunian's lower bound on the error probability for point-to-point channel coding with feedback.
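
A minimal sketch of the block-number tradeoff mentioned above, using illustrative notation that is not taken from the paper (B blocks of per-block length n, per-block rate R, effective rate R_eff, and a per-block error exponent E(.) that is nonincreasing in rate): with total blocklength N = nB and a union bound over the B blocks,

\[
  R_{\mathrm{eff}} = \frac{B-1}{B}\,R,
  \qquad
  \Pr[\mathrm{error}] \lesssim B\, e^{-n E(R)}
  = \exp\!\left(-N \cdot \frac{1}{B}\, E\!\left(\frac{B}{B-1}\, R_{\mathrm{eff}}\right) + o(N)\right),
\]

so for a fixed effective rate one maximizes \( \frac{1}{B}\, E\!\left(\frac{B}{B-1} R_{\mathrm{eff}}\right) \) over B. Increasing B brings the per-block rate closer to \( R_{\mathrm{eff}} \), which raises \( E(\cdot) \), while the \( 1/B \) normalization penalizes using many blocks, so the exponent per channel use is maximized at a finite number of blocks.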
