Optimal Point-to-Point Codes in Interference Channels: An Incremental I-MMSE Approach

A recent result of the authors establishes an I-MMSE-like relationship for the two-user Gaussian interference channel: in the limit as n → ∞, an I-MMSE relationship holds between the interference and the interfered-with receiver, assuming that the interfered-with transmission is an optimal point-to-point sequence (i.e., one achieving the point-to-point capacity). That result was further used to prove the "missing corner points" of the two-user Gaussian interference channel. This paper provides an information-theoretic proof of the above I-MMSE-like relationship that follows the incremental channel approach, the approach used by Guo, Shamai and Verdú to give an insightful proof of the original I-MMSE relationship for point-to-point channels. Finally, additional applications of this result are shown for other multi-user settings: the Gaussian multiple-access channel with interference and specific K-user Gaussian Z-interference channel settings.
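For context, the original point-to-point I-MMSE relationship of Guo, Shamai and Verdú, which the incremental channel approach was first used to prove, can be stated as follows. The notation below is the standard one for the scalar Gaussian channel, not taken from this paper itself:

```latex
% Scalar Gaussian channel: Y = \sqrt{\mathrm{snr}}\, X + N, with N \sim \mathcal{N}(0,1) independent of X.
\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\bigl(X;\, \sqrt{\mathrm{snr}}\, X + N\bigr)
  = \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
\qquad
\mathrm{mmse}(\mathrm{snr})
  = \mathbb{E}\Bigl[\bigl(X - \mathbb{E}\bigl[X \,\big|\, \sqrt{\mathrm{snr}}\, X + N\bigr]\bigr)^{2}\Bigr].
```

That is, the derivative of the mutual information with respect to the signal-to-noise ratio equals half the minimum mean-square error of estimating the input from the channel output; the paper's contribution is an analogous limiting relationship in the interference setting.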

[1] Yihong Wu et al., "Strong data processing inequalities in power-constrained Gaussian channels," 2015 IEEE International Symposium on Information Theory (ISIT), 2015.

[2] Shlomo Shamai et al., "Fading channels: How perfect need 'perfect side information' be?," IEEE Trans. Inf. Theory, 2002.

[3] Shlomo Shamai et al., "The effect of maximal rate codes on the interfering message rate," 2014 IEEE International Symposium on Information Theory, 2014.

[4] Shlomo Shamai et al., "Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error," IEEE Trans. Inf. Theory, 2010.

[5] Thomas M. Cover et al., Elements of Information Theory, 2005.

[6] Max H. M. Costa et al., "On the Gaussian interference channel," IEEE Trans. Inf. Theory, 1985.

[7] Shlomo Shamai et al., "Mutual information and minimum mean-square error in Gaussian channels," IEEE Trans. Inf. Theory, 2004.

[8] Rudolf Ahlswede et al., "Multi-way communication channels," 1973.

[9] Ram Zamir et al., "A Proof of the Fisher Information Inequality via a Data Processing Argument," IEEE Trans. Inf. Theory, 1998.

[10] Igal Sason et al., "On achievable rate regions for the Gaussian interference channel," IEEE Trans. Inf. Theory, 2004.

[11] Sergio Verdú et al., "On channel capacity per unit cost," IEEE Trans. Inf. Theory, 1990.

[12] Sergio Verdú et al., "Functional Properties of Minimum Mean-Square Error and Mutual Information," IEEE Trans. Inf. Theory, 2012.

[13] Sergio Verdú et al., "Approximation theory of output statistics," IEEE Trans. Inf. Theory, 1993.

[14] Fuzhen Zhang, The Schur Complement and Its Applications, 2005.

[15] Yihong Wu et al., "Wasserstein Continuity of Entropy and Outer Bounds for Interference Channels," IEEE Trans. Inf. Theory, 2015.

[16] Shlomo Shamai et al., "The Interplay Between Information and Estimation Measures," Found. Trends Signal Process., 2013.

[17] Shlomo Shamai et al., "Additive non-Gaussian noise channels: mutual information and conditional mean estimation," Proc. IEEE International Symposium on Information Theory (ISIT), 2005.

[18] Sergio Verdú et al., "Spectral efficiency in the wideband regime," IEEE Trans. Inf. Theory, 2002.

[19] Rüdiger L. Urbanke et al., "A rate-splitting approach to the Gaussian multiple-access channel," IEEE Trans. Inf. Theory, 1996.

[20] Shlomo Shamai et al., "The empirical distribution of good codes," IEEE Trans. Inf. Theory, 1997.

[21] Edward C. van der Meulen et al., "An asymptotic expression for the information and capacity of a multidimensional channel with weak input signals," IEEE Trans. Inf. Theory, 1993.