An alternative to decoding interference or treating interference as Gaussian noise

This paper addresses the following question regarding Gaussian networks: is there an alternative to decoding interference or treating interference as Gaussian noise? By answering this question we aim to establish a benchmark for practical systems in which multiuser decoding is not common practice. To state our result, we study a decentralized network of one Primary User (PU) and one Secondary User (SU), modeled as a two-user Gaussian interference channel. The primary transmitter is constellation-based, i.e., PU is equipped with a modulator and its codebook is constructed over a modulation signal set. SU uses random Gaussian codewords with a controlled transmission power that guarantees a prescribed Interference-to-Noise Ratio (INR) at the primary receiver. Neither user knows the other's codebook; however, SU is smart in the sense that it knows PU's constellation set. While interference at the primary receiver is modeled as additive Gaussian noise, the secondary receiver can exploit the structure of PU's modulator as side information to decode its own message without decoding PU's message. The instantaneous realizations of the symbols in a codeword transmitted by PU are unknown to both ends of SU's direct link, but the sample space of these symbols is available to SU. As a result, the interference plus noise at the secondary receiver is a mixed Gaussian process. Invoking the entropy power inequality together with an upper bound on the differential entropy of a mixed Gaussian vector, we develop an achievable rate for SU that is robust to the structure of PU's modulation signal set and depends only on its constellation size and the dimension of the Euclidean space in which the constellation points lie. Moreover, we obtain an achievable rate for PU using Fano's inequality in conjunction with a Gallager-type upper bound on the probability of error in decoding constellation points at the primary receiver.
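The mechanism behind the SU gain can be illustrated numerically. The snippet below is a minimal sketch, not taken from the paper: it models the interference plus noise at the secondary receiver as a mixed Gaussian process (an equiprobable BPSK symbol of the PU plus additive Gaussian noise), estimates its differential entropy by Monte Carlo, and compares it with the entropy of a Gaussian of the same variance, i.e., the value implicitly assumed when interference is treated as Gaussian noise. The amplitude and noise level are illustrative choices, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma = 1.0, 0.5        # illustrative BPSK amplitude and noise std
n = 200_000

# Interference plus noise at the secondary receiver: a PU constellation
# symbol (+/- a, equiprobable) plus zero-mean Gaussian noise.
symbols = rng.choice([-a, a], size=n)
z = symbols + rng.normal(0.0, sigma, size=n)

def mixture_pdf(x):
    # Two-component Gaussian mixture density of the interference plus noise.
    g = lambda m: np.exp(-(x - m) ** 2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    return 0.5 * (g(a) + g(-a))

# Monte Carlo estimate of the differential entropy h(Z) = -E[log p(Z)] (nats).
h_mixture = -np.mean(np.log(mixture_pdf(z)))

# Differential entropy of a Gaussian with the same variance a^2 + sigma^2.
var_total = a**2 + sigma**2
h_gaussian = 0.5 * np.log(2 * np.pi * np.e * var_total)

print(h_mixture, h_gaussian)
```

Since the Gaussian maximizes differential entropy under a variance constraint, `h_mixture` comes out strictly smaller than `h_gaussian`; this entropy gap is what a secondary receiver aware of PU's constellation can convert into a higher achievable rate than treating the interference as Gaussian noise would allow.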
These achievable rates for PU and SU enable us to show that the sum rate can be improved compared with a baseline in which both users employ Gaussian codewords and treat each other's signals as Gaussian noise.
