Fady Alajaji | Po-Ning Chen | Yunghsiang S. Han | Ling-Hua Chang
[1] Marat V. Burnashev. On the BSC reliability function: Expanding the region where it is known exactly, 2015, Problems of Information Transmission.
[2] Vincent Y. F. Tan et al. The Bee-Identification Problem: Bounds on the Error Exponent, 2019, IEEE Transactions on Communications.
[3] Neri Merhav et al. A Lagrange–Dual Lower Bound to the Error Exponent of the Typical Random Code, 2020, IEEE Transactions on Information Theory.
[4] Hsuan-Yin Lin et al. Optimal Ultrasmall Block-Codes for Binary Discrete Memoryless Channels, 2013, IEEE Transactions on Information Theory.
[5] H. Vincent Poor et al. A lower bound on the probability of error in multihypothesis testing, 1995, IEEE Transactions on Information Theory.
[6] R. J. McEliece et al. An improved upper bound on the block coding error exponent for binary input discrete memoryless channels, 1976.
[7] Fady Alajaji et al. A Generalized Poor-Verdú Error Bound for Multihypothesis Testing, 2012, IEEE Transactions on Information Theory.
[8] C. E. Shannon, R. G. Gallager, E. R. Berlekamp. Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. I, 1967, Information and Control.
[9] Robert J. McEliece et al. An improved upper bound on the block coding error exponent for binary-input discrete memoryless channels (Corresp.), 1977, IEEE Transactions on Information Theory.
[10] Robert G. Gallager. Information Theory and Reliable Communication, 1968, John Wiley & Sons.
[11] Richard E. Blahut. Principles and Practice of Information Theory, 1987.
[12] Fady Alajaji et al. The Asymptotic Generalized Poor-Verdú Bound Achieves the BSC Error Exponent at Zero Rate, 2020, IEEE International Symposium on Information Theory (ISIT).
[13] Imre Csiszár et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.
[14] Robert G. Gallager. A simple derivation of the coding theorem and some applications, 1965, IEEE Transactions on Information Theory.
[15] Adrià Tauste Campo et al. Bayesian M-ary Hypothesis Testing: The Meta-Converse and Verdú-Han Bounds Are Tight, 2014, IEEE Transactions on Information Theory.
[16] Evgueni Haroutunian et al. Reliability Criteria in Information Theory and in Statistical Hypothesis Testing, 2008, Foundations and Trends in Communications and Information Theory.
[17] Albert Guillén i Fàbregas et al. Mismatched Decoding: Error Exponents, Second-Order Rates and Saddlepoint Approximations, 2013, IEEE Transactions on Information Theory.
[18] Elwyn R. Berlekamp et al. Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels. II, 1967, Information and Control.
[19] Yury Polyanskiy. Saddle Point in the Minimax Converse for Channel Coding, 2013, IEEE Transactions on Information Theory.
[20] Sergio Verdú et al. A general formula for channel capacity, 1994, IEEE Transactions on Information Theory.
[21] A. Barg et al. Distance distribution of binary codes and the error probability of decoding, 2004, IEEE Transactions on Information Theory.
[22] Marco Dalai. Lower Bounds on the Probability of Error for Classical and Classical-Quantum Channels, 2012, IEEE Transactions on Information Theory.
[23] Vincent Y. F. Tan et al. Bee-Identification Error Exponent with Absentee Bees, 2020, IEEE International Symposium on Information Theory (ISIT).
[24] Li Peng et al. Expurgated Random-Coding Ensembles: Exponents, Refinements, and Connections, 2013, IEEE Transactions on Information Theory.
[25] H. Vincent Poor et al. Channel Coding Rate in the Finite Blocklength Regime, 2010, IEEE Transactions on Information Theory.
[26] Simon Litsyn. New Upper Bounds on Error Exponents, 1999, IEEE Transactions on Information Theory.