[1] Shlomo Shamai, et al. On information rates for mismatched decoders, 1994, IEEE Trans. Inf. Theory.
[2] Joseph M. Renes, et al. Universal polar codes for more capable and less noisy channels and sources, 2013, 2014 IEEE International Symposium on Information Theory.
[3] Imre Csiszár, et al. Information Theory and Statistics: A Tutorial, 2004, Found. Trends Commun. Inf. Theory.
[4] Albert Guillén i Fàbregas, et al. Extremes of random coding error exponents, 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[5] Werner Hürlimann, et al. Extremal Moment Methods and Stochastic Orders, 2008.
[6] D. Blackwell. Equivalent Comparisons of Experiments, 1953.
[7] Mine Alsan. Performance of mismatched polar codes over BSCs, 2012, 2012 International Symposium on Information Theory and its Applications.
[9] Mine Alsan. Extremality for Gallager's Reliability Function $E_{0}$, 2015, IEEE Transactions on Information Theory.
[10] Albert Guillén i Fàbregas, et al. Extremes of Error Exponents, 2012, IEEE Transactions on Information Theory.
[11] Thomas M. Cover, et al. Network Information Theory, 2001.
[12] Emre Telatar, et al. On the rate of channel polarization, 2008, 2009 IEEE International Symposium on Information Theory.
[13] Thomas M. Fischer. Some Remarks on the Role of Inaccuracy in Shannon's Theory of Information Transmission, 1978.
[14] D. A. Bell, et al. Information Theory and Reliable Communication, 1969.
[15] S. Shamai, et al. Information rates and error exponents of compound channels with application to antipodal signaling in a fading environment, 1993.
[16] Robert Mario Fano, et al. A heuristic discussion of probabilistic decoding, 1963, IEEE Trans. Inf. Theory.
[17] Erdal Arikan, et al. Channel Polarization: A Method for Constructing Capacity-Achieving Codes for Symmetric Binary-Input Memoryless Channels, 2008, IEEE Transactions on Information Theory.
[18] Rüdiger L. Urbanke, et al. Polar Codes for Channel and Source Coding, 2009, arXiv.
[19] Robert G. Gallager, et al. A simple derivation of the coding theorem and some applications, 1965, IEEE Trans. Inf. Theory.
[20] Samuel Karlin, et al. Generalized convex inequalities, 1963.
[21] Toshiyuki Tanaka, et al. Performance of polar codes with the construction using density evolution, 2009, IEEE Communications Letters.
[22] Vladimir B. Balakirsky. A converse coding theorem for mismatched decoding at the output of binary-input memoryless channels, 1995, IEEE Trans. Inf. Theory.
[23] Alexander Vardy, et al. List decoding of polar codes, 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[24] J. Massey. Guessing and entropy, 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[25] D. Blackwell, et al. The Capacity of a Class of Channels, 1959.
[26] Alexander Vardy, et al. Hardware architectures for successive cancellation decoding of polar codes, 2010, 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
[27] Lizhong Zheng, et al. Linear Universal Decoding for Compound Channels, 2010, IEEE Transactions on Information Theory.
[28] Emre Telatar, et al. On the construction of polar codes, 2011, 2011 IEEE International Symposium on Information Theory Proceedings.
[29] A. Rényi. On Measures of Entropy and Information, 1961.
[30] Daniel J. Costello, et al. Channel coding: The road to channel capacity, 2006, Proceedings of the IEEE.
[31] Mine Alsan. The Symmetric Convex Ordering: A Novel Partial Order for B-DMCs Ordering the Information Sets of Polar Codes, 2013.
[32] Claude E. Shannon, et al. The zero error capacity of a noisy channel, 1956, IRE Trans. Inf. Theory.
[33] V. W. S. Chan, et al. Principles of Digital Communication and Coding, 1979.
[34] C. E. Shannon, et al. A mathematical theory of communication, 1948, Bell Syst. Tech. J.
[35] Anne H. Soukhanov, et al. The American Heritage Dictionary of the English Language, 1992.
[36] Alexander Vardy, et al. How to Construct Polar Codes, 2011, IEEE Transactions on Information Theory.
[37] Aaron D. Wyner, et al. A theorem on the entropy of certain binary sequences and applications-II, 1973, IEEE Trans. Inf. Theory.
[38] Claude E. Shannon, et al. A Note on a Partial Ordering for Communication Channels, 1958, Information and Control.
[39] Erdal Arikan. Channel combining and splitting for cutoff rate improvement, 2006, IEEE Transactions on Information Theory.
[40] David Williams, et al. Probability with Martingales, 1991, Cambridge Mathematical Textbooks.
[41] Brendan J. Frey, et al. Factor graphs and the sum-product algorithm, 2001, IEEE Trans. Inf. Theory.
[42] Mine Alsan. Properties of the polarization transformations for the likelihood ratios of symmetric B-DMCs, 2013, 2013 13th Canadian Workshop on Information Theory.
[43] Aaron D. Wyner, et al. A theorem on the entropy of certain binary sequences and applications-I, 1973, IEEE Trans. Inf. Theory.
[44] Robert M. Gray, et al. Coding for noisy channels, 2011.
[45] Suguru Arimoto, et al. On the converse to the coding theorem for discrete memoryless channels (Corresp.), 1973, IEEE Trans. Inf. Theory.
[46] Mine Alsan, et al. Extremal Channels of Gallager's $E_{0}$ Under the Basic Polarization Transformations, 2014, IEEE Transactions on Information Theory.
[47] Erdal Arikan. Channel combining and splitting for cutoff rate improvement, 2005, Proceedings of the International Symposium on Information Theory (ISIT 2005).
[48] Erdal Arikan. An inequality on guessing and its application to sequential decoding, 1996, IEEE Trans. Inf. Theory.
[49] Imre Csiszár, et al. Channel capacity for a given decoding metric, 1995, IEEE Trans. Inf. Theory.
[50] Mine Alsan. Extremality properties for Gallager's random coding exponent, 2012, 2012 IEEE International Symposium on Information Theory Proceedings.
[51] Johannes B. Huber, et al. Bounds on information combining, 2005, IEEE Transactions on Information Theory.
[52] R. Szekli. Stochastic Ordering and Dependence in Applied Probability, 1995.
[53] Shlomo Shamai, et al. Extremes of information combining, 2005, IEEE Transactions on Information Theory.
[54] Imre Csiszár. Generalized cutoff rates and Rényi's information measures, 1995, IEEE Trans. Inf. Theory.
[55] Private Communications, 2001.
[56] G. Crooks. On Measures of Entropy and Information, 2015.
[57] Vladimir B. Balakirsky. Coding Theorem for Discrete Memoryless Channels with Given Decision Rule, 1991, Algebraic Coding.
[58] Imre Csiszár, et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.