On communications through a Gaussian noise channel with an MMSE disturbance constraint

This paper considers a Gaussian noise channel with one transmitter and two receivers, where the goal is to maximize the communication rate at the intended (primary) receiver subject to a disturbance constraint at the unintended (secondary) receiver. The disturbance is measured by the minimum mean-square error (MMSE) of the interference that the transmission to the primary receiver inflicts on the secondary receiver. The paper presents a new upper bound for the problem of maximizing the mutual information subject to an MMSE constraint; the bound holds for vector inputs of any length and recovers, in the limit as the input length tends to infinity, a previously known expression from the work of Bustin et al. The key technical novelty is a new upper bound on the MMSE, which controls the MMSE at all signal-to-noise ratio (SNR) values below a given SNR at which the MMSE is known (the SNR corresponding to the disturbance constraint). This complements the ‘single-crossing point property’ of the MMSE, which upper bounds the MMSE at all SNR values above a given SNR at which the MMSE is known. The new MMSE upper bound yields a refined characterization of the phase-transition phenomenon that manifests, in the limit as the input length goes to infinity, as a discontinuity of the MMSE for the problem at hand. A matching lower bound, to within an additive gap of order O(log log(1/MMSE)) (where MMSE denotes the disturbance constraint), is shown by means of the mixed inputs recently introduced by Dytso et al.
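For orientation, the central quantities can be sketched in standard notation from the MMSE literature; this sketch paraphrases well-known definitions rather than quoting the paper, and the parametrization of the mixed input below is illustrative. The scalar Gaussian channel at signal-to-noise ratio snr and the MMSE functional are

\[
Y_{\mathsf{snr}} = \sqrt{\mathsf{snr}}\, X + N, \qquad N \sim \mathcal{N}(0,1) \text{ independent of } X,
\]
\[
\mathrm{mmse}(X,\mathsf{snr}) \triangleq \mathbb{E}\!\left[\big(X - \mathbb{E}[X \mid Y_{\mathsf{snr}}]\big)^{2}\right],
\qquad
\mathrm{mmse}(X_{G},\mathsf{snr}) = \frac{\sigma^{2}}{1+\sigma^{2}\mathsf{snr}} \ \text{ for } X_{G} \sim \mathcal{N}(0,\sigma^{2}).
\]

One consequence of the single-crossing point property (the bound that operates above the constraint SNR, which the paper's new bound complements from below) is that

\[
\mathrm{mmse}(X,\mathsf{snr}_{0}) = \frac{\sigma^{2}}{1+\sigma^{2}\mathsf{snr}_{0}}
\quad \Longrightarrow \quad
\mathrm{mmse}(X,\mathsf{snr}) \le \frac{\sigma^{2}}{1+\sigma^{2}\mathsf{snr}}
\ \text{ for all } \mathsf{snr} \ge \mathsf{snr}_{0}.
\]

The mixed inputs of Dytso et al. superimpose a discrete part and a Gaussian part with a power-split parameter, for example

\[
X = \sqrt{1-\delta}\, X_{D} + \sqrt{\delta}\, X_{G}, \qquad \delta \in [0,1],
\]

with \(X_{D}\) discrete (e.g., PAM) and \(X_{G}\) Gaussian.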

[1] Daniela Tuninetti et al., "Interference as Noise: Friend or Foe?," IEEE Transactions on Information Theory, 2015.

[2] Te Sun Han et al., "A new achievable rate region for the interference channel," IEEE Transactions on Information Theory, 1981.

[3] Daniela Tuninetti et al., "On the Two-User Interference Channel With Lack of Knowledge of the Interference Codebook at One Receiver," IEEE Transactions on Information Theory, 2015.

[4] Gerhard Kramer et al., "A New Outer Bound and the Noisy-Interference Sum-Rate Capacity for Gaussian Interference Channels," IEEE Transactions on Information Theory, 2007.

[5] Abbas El Gamal et al., "Communication With Disturbance Constraints," IEEE Transactions on Information Theory, 2014.

[6] Hiroshi Sato et al., "The capacity of the Gaussian interference channel under strong interference," IEEE Transactions on Information Theory, 1981.

[7] Shlomo Shamai et al., "Statistical Physics of Signal Estimation in Gaussian Noise: Theory and Examples of Phase Transitions," IEEE Transactions on Information Theory, 2008.

[8] Antonia Maria Tulino et al., "Optimum power allocation for parallel Gaussian channels with arbitrary input distributions," IEEE Transactions on Information Theory, 2006.

[9] Shlomo Shamai et al., "On MMSE Crossing Properties and Implications in Parallel Vector Gaussian Channels," IEEE Transactions on Information Theory, 2013.

[10] Shlomo Shamai et al., "MMSE of “Bad” Codes," IEEE Transactions on Information Theory, 2013.

[11] Shlomo Shamai et al., "Mutual information and minimum mean-square error in Gaussian channels," IEEE Transactions on Information Theory, 2004.

[12] Kamyar Moshksar et al., "An alternative to decoding interference or treating interference as Gaussian noise," Proc. IEEE International Symposium on Information Theory (ISIT), 2011.

[13] A. Robert Calderbank et al., "Soft-Decoding-Based Strategies for Relay and Interference Channels: Analysis and Achievable Rates Using LDPC Codes," IEEE Transactions on Information Theory, 2010.

[14] Hua Wang et al., "Gaussian Interference Channel Capacity to Within One Bit," IEEE Transactions on Information Theory, 2007.

[15] Shlomo Shamai et al., "Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error," IEEE Transactions on Information Theory, 2010.