Yet another entropy power inequality with an application
[1] Patrick P. Bergmans, et al. A simple converse for broadcast channels with additive white Gaussian noise (Corresp.), 1974, IEEE Transactions on Information Theory.
[2] Yasutada Oohama, et al. Rate-distortion theory for Gaussian multiterminal source coding systems with several side informations at the decoder, 2005, IEEE Transactions on Information Theory.
[3] A. J. Stam. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon, 1959, Information and Control.
[4] Amir Dembo, et al. Simple proof of the concavity of the entropy power with respect to Gaussian noise, 1989, IEEE Transactions on Information Theory.
[5] Shlomo Shamai, et al. Mutual information and minimum mean-square error in Gaussian channels, 2004, IEEE Transactions on Information Theory.
[6] Daniel Pérez Palomar, et al. On optimal precoding in linear vector Gaussian channels with arbitrary input distribution, 2009, 2009 IEEE International Symposium on Information Theory.
[7] Shlomo Shamai, et al. A Vector Generalization of Costa's Entropy-Power Inequality With Applications, 2009, IEEE Transactions on Information Theory.
[8] Meritxell Lamarca, et al. Linear precoding for mutual information maximization in MIMO systems, 2009, 2009 6th International Symposium on Wireless Communication Systems.
[9] Claude E. Shannon. A Mathematical Theory of Communication, 1948, Bell System Technical Journal.
[10] Antonia Maria Tulino, et al. Optimum power allocation for parallel Gaussian channels with arbitrary input distributions, 2006, IEEE Transactions on Information Theory.
[11] Roger A. Horn and Charles R. Johnson. Matrix Analysis, 1985, Cambridge University Press.
[12] J. Magnus, et al. Matrix Differential Calculus with Applications in Statistics and Econometrics, 1991.
[13] Olivier Rioul, et al. Information Theoretic Proofs of Entropy Power Inequalities, 2007, IEEE Transactions on Information Theory.
[14] Emre Telatar, et al. Capacity of Multi-antenna Gaussian Channels, 1999, European Transactions on Telecommunications.
[15] Max H. M. Costa, et al. A new entropy power inequality, 1985, IEEE Transactions on Information Theory.
[16] Nelson M. Blachman, et al. The convolution inequality for entropy powers, 1965, IEEE Transactions on Information Theory.
[17] Max H. M. Costa, et al. On the Gaussian interference channel, 1985, IEEE Transactions on Information Theory.
[18] Daniel Pérez Palomar, et al. Gradient of mutual information in linear vector Gaussian channels, 2006, IEEE Transactions on Information Theory.
[19] Sergio Verdú, et al. A simple proof of the entropy-power inequality, 2006, IEEE Transactions on Information Theory.
[20] Cédric Villani, et al. A short proof of the "Concavity of entropy power", 2000, IEEE Transactions on Information Theory.
[21] Daniel Pérez Palomar, et al. Hessian and Concavity of Mutual Information, Differential Entropy, and Entropy Power in Linear Vector Gaussian Channels, 2009, IEEE Transactions on Information Theory.
[22] Amos Lapidoth, et al. Capacity bounds via duality with applications to multiple-antenna systems on flat-fading channels, 2003, IEEE Transactions on Information Theory.
[23] Thomas M. Cover, et al. Elements of Information Theory, 2005.
[24] Miguel R. D. Rodrigues, et al. MIMO Gaussian Channels With Arbitrary Inputs: Optimal Precoding and Power Allocation, 2010, IEEE Transactions on Information Theory.
[25] Sennur Ulukus, et al. Dependence Balance Based Outer Bounds for Gaussian Networks With Cooperation and Feedback, 2011, IEEE Transactions on Information Theory.