On the Accuracy of the High-SNR Approximation of the Differential Entropy of Signals in Additive Gaussian Noise: Real and Complex Cases

One approach to analyzing the high signal-to-noise ratio (SNR) capacity of noncoherent wireless communication systems is to ignore the noise component of the received signal when computing its differential entropy. In this paper, we consider the error incurred by this approximation when the transmitter and the receiver each have one antenna and the noise is Gaussian. We treat both the complex and the real cases, and we show that when the probability density function (pdf) of the signal component of the received signal is piecewise differentiable, the approximation error decays as 1/SNR, which tightens the previously available result that the error is o(1). In addition, we consider the special instance in which the signal component of the received signal corresponds to a signal transmitted over a channel with a Gaussian fading coefficient. For that case, we provide explicit expressions for the first nonconstant term of the Taylor expansion of the differential entropy, and we invoke the Cauchy–Schwarz inequality to obtain an efficiently computable bound on it. Our results are supported by numerical examples.
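The 1/SNR decay of the approximation error can be checked in closed form for the simplest case, not treated as such in the abstract but consistent with it: a real Gaussian signal in real Gaussian noise. There, h(Y) = (1/2)ln(2πe(σ_x² + σ_n²)) is exact, the noise-free approximation is h(X) = (1/2)ln(2πe σ_x²), and the error is (1/2)ln(1 + 1/SNR) ≈ 1/(2·SNR). The sketch below (standard-library Python; the function names are illustrative, not from the paper) verifies this behavior numerically:

```python
import math

def h_gauss(var):
    # Differential entropy (in nats) of a real Gaussian with variance var
    return 0.5 * math.log(2 * math.pi * math.e * var)

def approx_error(snr):
    # Signal X ~ N(0, snr), noise N ~ N(0, 1), received Y = X + N.
    # Exact entropy of Y minus the noise-free (high-SNR) approximation h(X);
    # equals 0.5 * ln(1 + 1/SNR) in this all-Gaussian case.
    exact = h_gauss(snr + 1.0)   # h(Y), since X + N is Gaussian with variance snr + 1
    approx = h_gauss(snr)        # high-SNR approximation: drop the noise, use h(X)
    return exact - approx

# The scaled error SNR * error tends to 1/2 as SNR grows,
# illustrating the 1/SNR decay rate shown in the paper.
for snr_db in (10, 20, 30, 40):
    snr = 10 ** (snr_db / 10)
    err = approx_error(snr)
    print(f"SNR = {snr_db:2d} dB   error = {err:.3e} nats   SNR*error = {snr * err:.4f}")
```

This only demonstrates the rate for the Gaussian-signal special case; the paper's contribution is establishing the same 1/SNR decay for general piecewise-differentiable signal pdfs.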
