Second-order asymptotics for the Gaussian interference channel with strictly very strong interference

The second-order asymptotics of the Gaussian interference channel in the strictly very strong interference regime are considered. The rates of convergence to a given point on the boundary of the (first-order) capacity region are determined. These rates are expressed in terms of the average probability of error and the variances of selected modified information densities, which coincide with the dispersion of the (single-user) Gaussian channel. Interestingly, under the strictly very strong interference assumption, the intuition that receivers can decode the messages of non-intended transmitters carries over to the second-order analysis.
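As background for the dispersion quantity the abstract refers to, the following is a minimal sketch of the standard normal approximation for the single-user Gaussian channel, log M*(n, ε) ≈ nC(P) − √(nV(P)) Q⁻¹(ε), where C(P) = ½ log(1 + P) is the capacity and V(P) = P(P+2)/(2(P+1)²) is the Gaussian dispersion (both in nats here). The function names and the choice of units are illustrative, not taken from the paper.

```python
import math
from statistics import NormalDist

def gaussian_capacity(snr):
    """AWGN capacity C(P) = (1/2) log(1 + P), in nats per channel use."""
    return 0.5 * math.log(1.0 + snr)

def gaussian_dispersion(snr):
    """AWGN dispersion V(P) = P(P + 2) / (2 (P + 1)^2), in nats^2."""
    return snr * (snr + 2.0) / (2.0 * (snr + 1.0) ** 2)

def q_inv(eps):
    """Inverse of the Gaussian Q-function Q(x) = P(N(0,1) > x)."""
    return NormalDist().inv_cdf(1.0 - eps)

def normal_approx_log_M(n, snr, eps):
    """Normal approximation to the log of the maximal code size:
    log M*(n, eps) ~ n C(P) - sqrt(n V(P)) Q^{-1}(eps)."""
    return (n * gaussian_capacity(snr)
            - math.sqrt(n * gaussian_dispersion(snr)) * q_inv(eps))
```

For example, at blocklength n = 1000, SNR P = 1, and ε = 10⁻³, the √n backoff term subtracts roughly 17% from the capacity term, which is the kind of second-order penalty the paper characterizes for the interference channel.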
