On the minimum mean p-th error in Gaussian noise channels and its applications

The problem of estimating an arbitrary random vector from its observation corrupted by additive white Gaussian noise is considered, where the cost function is taken to be the minimum mean $p$-th error (MMPE). The classical minimum mean square error (MMSE) is a special case of the MMPE. Several bounds, properties, and applications of the MMPE are derived and discussed. The optimal MMPE estimator is found for Gaussian and binary input distributions. Properties of the MMPE as a function of the input distribution, the signal-to-noise ratio (SNR), and the order $p$ are derived. The “single-crossing-point property” (SCPP), which provides an upper bound on the MMSE and which, together with the mutual information–MMSE relationship, is a powerful tool for deriving converse proofs in multi-user information theory, is extended to the MMPE. Moreover, a complementary bound to the SCPP is derived. As a first application of the MMPE, a bound on the conditional differential entropy in terms of the MMPE is provided, which in turn yields a generalization of the Ozarow–Wyner lower bound on the mutual information achieved by a discrete input on a Gaussian noise channel. As a second application, the MMPE is shown to improve on previous characterizations of the phase transition phenomenon, which manifests, in the limit as the length of the capacity-achieving code goes to infinity, as a discontinuity of the MMSE as a function of SNR. As a final application, the MMPE is used to establish new bounds on the second derivative of the mutual information with respect to SNR or, equivalently, on the first derivative of the MMSE.
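For concreteness, one standard way to formalize the quantity described above (a sketch consistent with the abstract, not quoted from the paper, and stated here up to possible normalization constants): for an $n$-dimensional input $X$ observed through the Gaussian channel, the MMPE of order $p$ at a given SNR can be written as

$$\mathrm{mmpe}(X, \mathsf{snr}, p) \;=\; \inf_{f}\, \mathbb{E}\big[\, \|X - f(Y)\|^{p} \,\big], \qquad Y = \sqrt{\mathsf{snr}}\, X + Z, \quad Z \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_n),$$

where the infimum is taken over all measurable estimators $f$ of $X$ from the observation $Y$. Setting $p = 2$ recovers the classical MMSE, $\mathrm{mmse}(X, \mathsf{snr}) = \mathrm{mmpe}(X, \mathsf{snr}, 2)$, for which the optimal estimator is the conditional mean $\mathbb{E}[X \mid Y]$; for general $p$ the optimal estimator need not be the conditional mean, which is what makes the Gaussian and binary cases treated in the paper noteworthy.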
