Relative entropy at the channel output of a capacity-achieving code