On the applications of the minimum mean p-th error (MMPE) to information theoretic quantities
[1] Shlomo Shamai, et al. Information rates for a discrete-time Gaussian channel with intersymbol interference and stationary inputs, 1991, IEEE Trans. Inf. Theory.
[2] Daniela Tuninetti, et al. Interference as Noise: Friend or Foe?, 2015, IEEE Transactions on Information Theory.
[3] Shlomo Shamai, et al. On MMSE Crossing Properties and Implications in Parallel Vector Gaussian Channels, 2013, IEEE Transactions on Information Theory.
[4] Thomas M. Cover, et al. Elements of Information Theory, Second Edition, 2005.
[5] Shlomo Shamai, et al. The Interplay Between Information and Estimation Measures, 2013, Found. Trends Signal Process.
[6] Kenneth Rose, et al. On Conditions for Linearity of Optimal Estimation, 2010, IEEE Transactions on Information Theory.
[7] Neri Merhav, et al. Lower Bounds on Parameter Modulation–Estimation Under Bandwidth Constraints, 2017, IEEE Transactions on Information Theory.
[8] H. Vincent Poor, et al. On the minimum mean p-th error in Gaussian noise channels and its applications, 2016, 2016 IEEE International Symposium on Information Theory (ISIT).
[9] Shlomo Shamai, et al. Estimation in Gaussian Noise: Properties of the Minimum Mean-Square Error, 2010, IEEE Transactions on Information Theory.
[10] S. Verdú, et al. The impact of constellation cardinality on Gaussian channel capacity, 2010, 2010 48th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[11] Erwin Lutwak, et al. Moment-Entropy Inequalities for a Random Vector, 2007, IEEE Transactions on Information Theory.
[12] Aaron D. Wyner, et al. On the capacity of the Gaussian channel with a finite number of input levels, 1990, IEEE Trans. Inf. Theory.