Codewords With Memory Improve Achievable Rate Regions of the Memoryless Gaussian Interference Channel

The two-user Gaussian interference channel (GIC) has been extensively studied in the literature during the last four decades. The full characterization of the capacity region of the GIC is a long-standing open problem, except in the case of strong or very strong interference. For general GICs, many inner bounds have been provided over the years; among them, the Han-Kobayashi (HK) region is the most celebrated one. Unfortunately, the calculation of the HK region is prohibitively complex, due to the appearance of auxiliary random variables whose optimal choice is an open problem. As in other multi-user communication systems, these achievable regions are based on ensembles of i.i.d. (memoryless) codewords, in the sense that the symbols within each codeword are drawn independently. In this paper, we show that for the GIC it is worthwhile to employ random coding ensembles of codewords with memory. Specifically, we take known achievable regions for the GIC and generalize/improve them by allowing dependence between the code symbols. For example, we improve the state-of-the-art HK region by drawing the symbols of each codeword, for each user, from a first-order autoregressive moving-average (ARMA) Gaussian process (see the sketch below). In this way, we obtain several new achievable rate regions, which are easily calculable and strictly better than the previously known state-of-the-art achievable regions.
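A minimal sketch of the kind of codeword ensemble the abstract refers to: symbols drawn from a first-order ARMA (ARMA(1,1)) Gaussian process, X_i = a*X_{i-1} + W_i + b*W_{i-1} with i.i.d. Gaussian innovations W_i, then rescaled to meet an average power constraint. This is not the authors' construction; the parameters a, b, sigma_w, and P below are illustrative placeholders, not values taken from the paper.

import numpy as np

def arma11_codeword(n, a=0.5, b=0.2, sigma_w=1.0, P=1.0, rng=None):
    """Draw one length-n Gaussian codeword with ARMA(1,1) memory,
    scaled so the empirical per-symbol power equals P (hypothetical parameters)."""
    rng = np.random.default_rng() if rng is None else rng
    w = rng.normal(0.0, sigma_w, size=n + 1)  # innovations W_0, ..., W_n
    x = np.empty(n)
    prev_x, prev_w = 0.0, w[0]
    for i in range(n):
        x[i] = a * prev_x + w[i + 1] + b * prev_w  # ARMA(1,1) recursion
        prev_x, prev_w = x[i], w[i + 1]
    x *= np.sqrt(P / np.mean(x ** 2))  # enforce (1/n) * sum x_i^2 = P
    return x

codeword = arma11_codeword(n=1000)
print(np.mean(codeword ** 2))  # ~P, the imposed per-symbol power

Setting a = b = 0 recovers the usual memoryless i.i.d. Gaussian ensemble, which is the sense in which the ensembles with memory generalize the classical ones.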
