Worst additive noise: An information-estimation view

The “worst additive noise” problem is considered. The problem concerns an additive-noise channel in which the input is partially specified, and the noise consists of an additive Gaussian component plus an additive component of arbitrary distribution. The question is: which distribution of the additive noise minimizes the mutual information between the input and the output? Two settings of this problem are considered. In the first setting the input is Gaussian with a given covariance matrix, and it is shown that the problem can be handled within the framework of the Guo, Shamai and Verdú I-MMSE relationship. This framework yields a simple derivation of Diggavi and Cover's result that, under a covariance constraint, the “worst additive noise” distribution is Gaussian, i.e., Gaussian noise minimizes the input-output mutual information when the input is Gaussian. The I-MMSE framework further shows that, for a Gaussian input, under any constraint on the noise distribution that does not exclude a Gaussian distribution, the “worst” distribution is a Gaussian distribution complying with the constraint. In the second setting the input is a codeword from an optimal (capacity-achieving) point-to-point codebook, and it is shown, for a subset of SNRs, that the minimum mutual information is attained when the additive signal is Gaussian-like up to a given SNR.
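For context, the I-MMSE relationship of Guo, Shamai and Verdú [1], in its scalar form (nats), reads

\frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\!\left(X;\, \sqrt{\mathrm{snr}}\, X + N\right) \;=\; \frac{1}{2}\,\mathrm{mmse}(\mathrm{snr}), \qquad N \sim \mathcal{N}(0,1),

where \mathrm{mmse}(\mathrm{snr}) = \mathbb{E}\big[(X - \mathbb{E}[X \mid \sqrt{\mathrm{snr}}\, X + N])^2\big].

The covariance-constrained statement can also be checked numerically. The following Python sketch is a minimal scalar illustration, not part of the paper: it assumes a Gaussian input and compares Gaussian noise against uniform noise of the same variance (all variable names are illustrative), confirming that the Gaussian choice gives the smaller input-output mutual information.

import numpy as np
from scipy.stats import norm

P = 1.0          # input variance, X ~ N(0, P)
sigma2 = 1.0     # noise variance (the covariance constraint)

# Gaussian noise: closed-form mutual information, in nats.
I_gauss = 0.5 * np.log(1.0 + P / sigma2)

# Uniform noise on [-a, a] with the same variance (sigma2 = a^2 / 3).
a = np.sqrt(3.0 * sigma2)
h_noise = np.log(2.0 * a)            # differential entropy of the uniform noise

# Density of Y = X + N: convolution of N(0, P) with Uniform[-a, a],
# f_Y(y) = [Phi((y + a)/sqrt(P)) - Phi((y - a)/sqrt(P))] / (2a).
y = np.linspace(-12.0, 12.0, 100001)
f_y = (norm.cdf((y + a) / np.sqrt(P)) - norm.cdf((y - a) / np.sqrt(P))) / (2.0 * a)

# h(Y) by numerical integration; f*log(f) is taken as 0 where f underflows to 0.
dy = y[1] - y[0]
integrand = -f_y * np.log(np.where(f_y > 0, f_y, 1.0))
h_Y = np.sum(integrand) * dy

# For additive noise independent of the input, I(X; Y) = h(Y) - h(N).
I_uniform = h_Y - h_noise

print(f"I(X;Y), Gaussian noise: {I_gauss:.4f} nats")
print(f"I(X;Y), uniform noise : {I_uniform:.4f} nats (expected to be larger)")

With P = sigma2 = 1 the Gaussian-noise value is roughly 0.35 nats while the uniform-noise value comes out around 0.5 nats, consistent with Gaussian noise being the “worst” (minimizing) choice under the variance constraint.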

[1] Shlomo Shamai, et al. Mutual information and minimum mean-square error in Gaussian channels, 2004, IEEE Transactions on Information Theory.

[2] Amos Lapidoth, et al. Nearest neighbor decoding for additive non-Gaussian noise channels, 1996, IEEE Transactions on Information Theory.

[3] Shlomo Shamai, et al. On MMSE Crossing Properties and Implications in Parallel Vector Gaussian Channels, 2013, IEEE Transactions on Information Theory.

[4] Sergio Verdú, et al. Approximation theory of output statistics, 1993, IEEE Transactions on Information Theory.

[5] Shlomo Shamai, et al. Proof of Entropy Power Inequalities Via MMSE, 2006, IEEE International Symposium on Information Theory.

[6] Shlomo Shamai, et al. An MMSE Approach to the Secrecy Capacity of the MIMO Gaussian Wiretap Channel, 2009, IEEE International Symposium on Information Theory.

[7] Shlomo Shamai, et al. The effect of maximal rate codes on the interfering message rate, 2014, IEEE International Symposium on Information Theory.

[8] Daniel Pérez Palomar, et al. Gradient of mutual information in linear vector Gaussian channels, 2005, IEEE International Symposium on Information Theory.

[9] Shlomo Shamai, et al. Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error, 2010, IEEE Transactions on Information Theory.

[10] Shlomo Shamai, et al. Statistical Physics of Signal Estimation in Gaussian Noise: Theory and Examples of Phase Transitions, 2008, IEEE Transactions on Information Theory.

[11] Shlomo Shamai, et al. On extrinsic information of good binary codes operating over Gaussian channels, 2007, European Transactions on Telecommunications.

[12] Sergio Verdú, et al. Functional Properties of Minimum Mean-Square Error and Mutual Information, 2012, IEEE Transactions on Information Theory.

[13] Shlomo Shamai, et al. The Interplay Between Information and Estimation Measures, 2013, Foundations and Trends in Signal Processing.

[14] Suhas N. Diggavi, et al. The worst additive noise under a covariance constraint, 2001, IEEE Transactions on Information Theory.

[15] A. Lee Swindlehurst, et al. Full Rank Solutions for the MIMO Gaussian Wiretap Channel With an Average Power Constraint, 2012, IEEE Transactions on Signal Processing.

[16] Charalambos D. Charalambous, et al. On optimal signaling over secure MIMO channels, 2012, IEEE International Symposium on Information Theory.