Error Exponents of Optimum Decoding for the Interference Channel

Exponential error bounds for the finite-alphabet interference channel (IFC) with two transmitter-receiver pairs are investigated under the random coding regime. Our focus is on optimum decoding, as opposed to the heuristic decoding rules that have been used in previous works, such as joint typicality decoding, decoding based on interference cancellation, and decoding that treats the interference as additional noise. Indeed, the fact that the actual interfering signal is a codeword, rather than an independent and identically distributed (i.i.d.) noise process, complicates the application of conventional techniques to the performance analysis of the optimum decoder. Using analytical tools rooted in statistical physics, we derive a single-letter expression for the error exponents achievable under optimum decoding and demonstrate a strict improvement over the error exponents obtainable using suboptimal decoding rules, which, in contrast, are amenable to more conventional analysis.
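To make the distinction concrete, the following minimal sketch (our illustration, not the paper's construction) simulates a toy binary two-user IFC and contrasts the optimum decoder for one message, which marginalizes the likelihood over the interferer's actual codebook, with a decoder that treats the interference as i.i.d. noise. The channel model Y = X1 XOR X2 XOR Z, the Bernoulli parameters eps and q2, the block length, and the rates are all hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy two-user IFC (illustrative assumption, not the paper's model) ---
# Receiver 1 observes Y = X1 XOR X2 XOR Z, with Z ~ Bernoulli(eps).
eps = 0.05    # channel noise level (hypothetical)
q2 = 0.25     # P(X2 = 1): interferer's skewed random-coding input distribution
n = 16        # block length (kept tiny so exhaustive decoding is feasible)
M1 = M2 = 16  # 2^{nR} codewords per user at rate R = 0.25

def p_y1(x1, x2):
    """P(Y = 1 | x1, x2) for the toy channel above."""
    return np.where((x1 ^ x2) == 1, 1.0 - eps, eps)

def log_lik(y, p1):
    """Sum of per-symbol log-likelihoods log P(y_i | .)."""
    return np.log(np.where(y == 1, p1, 1.0 - p1)).sum(axis=-1)

def simulate(trials=500):
    err_opt = err_tin = 0
    for _ in range(trials):
        C1 = rng.integers(0, 2, size=(M1, n))        # user 1: uniform random codebook
        C2 = (rng.random((M2, n)) < q2).astype(int)  # user 2: Bernoulli(q2) codebook
        m1, m2 = rng.integers(M1), rng.integers(M2)
        y = (rng.random(n) < p_y1(C1[m1], C2[m2])).astype(int)

        # Optimum decoder for message 1: marginalize over user 2's actual CODEBOOK,
        # i.e. argmax over m1' of sum over m2' of P(y | x1(m1'), x2(m2')).
        logp = log_lik(y, p_y1(C1[:, None, :], C2[None, :, :]))  # shape (M1, M2)
        err_opt += np.argmax(np.logaddexp.reduce(logp, axis=1)) != m1

        # Treat-interference-as-noise: marginalize each symbol over the i.i.d.
        # input law Bernoulli(q2) instead of over the actual codewords.
        p_tin = (1 - q2) * p_y1(C1, 0) + q2 * p_y1(C1, 1)        # shape (M1, n)
        err_tin += np.argmax(log_lik(y, p_tin)) != m1

    return err_opt / trials, err_tin / trials

print("P(error): optimum = %.3f, treat-as-noise = %.3f" % simulate())
```

At these illustrative parameters the treat-as-noise decoder faces an effective binary symmetric channel with crossover probability about 0.275, whose capacity is below the code rate of 0.25, so its error rate stays high; the codebook-aware metric instead exploits the structure of the interference. The paper's contribution is an exact single-letter error exponent for the latter kind of decoder, which this toy simulation only gestures at.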
