On the applications of the minimum mean p-th error (MMPE) to information theoretic quantities

This paper considers the minimum mean p-th error (MMPE) estimation problem: estimating a random vector in the presence of additive white Gaussian noise (AWGN) so as to minimize the p-th moment of the estimation error. The MMPE generalizes the classical minimum mean square error (MMSE) estimation problem, which is recovered at p = 2. This paper derives basic properties of the optimal MMPE estimator and of the MMPE functional. Optimal estimators are found for several inputs of interest, such as Gaussian and binary symbols. Under an appropriate p-th moment constraint, the Gaussian input is shown to be asymptotically the hardest to estimate for any p ≥ 1. By using a conditional version of the MMPE, the well-known "MMSE single-crossing point" bound is shown to hold for the MMPE as well, for all p ≥ 1, up to a multiplicative constant. Finally, the paper develops connections between the conditional differential entropy and the MMPE, which lead to a tighter version of the Ozarow-Wyner lower bound on the rate achievable by discrete inputs on AWGN channels.
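The MMPE functional described above can be illustrated numerically. The sketch below follows the standard setup mmpe_p(X, snr) = inf over estimators f of E[|X − f(Y)|^p], with Y = √snr·X + Z and Z standard Gaussian; the function name and Monte Carlo parameters are ours, not the paper's. For a standard Gaussian input the linear conditional-mean estimator f(Y) = (√snr/(1+snr))·Y is optimal at p = 2, where the value reduces to the classical MMSE, 1/(1+snr).

```python
import numpy as np

def empirical_mmpe_gaussian(snr, p, n=200_000, seed=0):
    """Empirical p-th moment of the estimation error of the
    conditional-mean estimator f(Y) = sqrt(snr)/(1+snr) * Y
    for a standard Gaussian input in AWGN.

    For p = 2 this estimator is optimal and the empirical value
    approaches the classical MMSE, 1/(1+snr).
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)          # input X ~ N(0, 1)
    z = rng.standard_normal(n)          # noise Z ~ N(0, 1)
    y = np.sqrt(snr) * x + z            # channel output Y
    err = x - np.sqrt(snr) / (1 + snr) * y
    return np.mean(np.abs(err) ** p)

# At snr = 1 and p = 2 the MMSE of a standard Gaussian input is
# 1/(1 + snr) = 0.5; the empirical estimate should be close to it.
print(empirical_mmpe_gaussian(snr=1.0, p=2))
```

Varying p in the call above gives the corresponding empirical p-th error moment of the same linear estimator, which upper-bounds the MMPE for that p.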
