On extrinsic information of good binary codes operating over Gaussian channels
[1] Shlomo Shamai,et al. LDPC coded MIMO multiple access with iterative joint decoding , 2005, IEEE Transactions on Information Theory.
[2] Sergio Verdú,et al. Approximation theory of output statistics , 1993, IEEE Transactions on Information Theory.
[3] A. Glavieux,et al. Near Shannon limit error-correcting coding and decoding: Turbo-codes. 1 , 1993, Proceedings of ICC '93 - IEEE International Conference on Communications.
[4] G. Kramer,et al. Code rate and the area under extrinsic information transfer curves , 2002, Proceedings IEEE International Symposium on Information Theory.
[5] Shlomo Shamai,et al. Mutual information and minimum mean-square error in Gaussian channels , 2004, IEEE Transactions on Information Theory.
[6] Andrea Montanari,et al. Life Above Threshold: From List Decoding to Area Theorem and MSE , 2004, ArXiv.
[7] Shlomo Shamai,et al. Extremes of information combining , 2005, IEEE Transactions on Information Theory.
[8] Shlomo Shamai,et al. The empirical distribution of good codes , 1997, IEEE Transactions on Information Theory.
[9] Shlomo Shamai,et al. Efficient Communication Over the Discrete-Time Memoryless Rayleigh Fading Channel with Turbo Coding/Decoding , 2000, European Transactions on Telecommunications.
[10] Stephan ten Brink,et al. Designing Iterative Decoding Schemes with the Extrinsic Information Transfer Chart , 2001.
[11] Krishna R. Narayanan,et al. An MSE Based Transfer Chart to Analyze Iterative Decoding Schemes , 2005, ArXiv.
[12] Shlomo Shamai,et al. On Extrinsic Information of Good Codes Operating Over Discrete Memoryless Channels , 2005, ArXiv.