Optimum Tradeoffs Between the Error Exponent and the Excess-Rate Exponent of Variable-Rate Slepian–Wolf Coding
[1] Imre Csiszár. Linear codes for sources and source networks: Error exponents, universal coding, 1982, IEEE Trans. Inf. Theory.
[2] Neri Merhav. On Optimum Parameter Modulation–Estimation From a Large Deviations Perspective, 2012, IEEE Trans. Inf. Theory.
[3] Imre Csiszár, et al. Towards a general theory of source networks, 1980, IEEE Trans. Inf. Theory.
[4] Robert G. Gallager, et al. The random coding bound is tight for the average code (Corresp.), 1973, IEEE Trans. Inf. Theory.
[5] Frederick Jelinek, et al. Buffer overflow in variable length coding of fixed rate sources, 1968, IEEE Trans. Inf. Theory.
[6] Stephen P. Boyd, et al. Convex Optimization, 2004, Cambridge University Press.
[7] Imre Csiszár, et al. Information Theory and Statistics: A Tutorial, 2004, Found. Trends Commun. Inf. Theory.
[8] Te Sun Han, et al. Universal coding for the Slepian-Wolf data compression system and the strong converse theorem, 1994, IEEE Trans. Inf. Theory.
[9] Pierre A. Humblet. Generalization of Huffman coding to minimize the probability of buffer overflow, 1981, IEEE Trans. Inf. Theory.
[10] Aaron B. Wagner, et al. Improved Source Coding Exponents via Witsenhausen's Rate, 2011, IEEE Trans. Inf. Theory.
[11] Richard E. Blahut, et al. Hypothesis testing and information theory, 1974, IEEE Trans. Inf. Theory.
[12] Frederick Jelinek, et al. Evaluation of expurgated bound exponents, 1968, IEEE Trans. Inf. Theory.
[13] Thomas M. Cover, et al. Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing), 2006.
[14] Katalin Marton, et al. Error exponent for source coding with a fidelity criterion, 1974, IEEE Trans. Inf. Theory.
[15] Imre Csiszár. On the error exponent of source-channel transmission with a distortion threshold, 1982, IEEE Trans. Inf. Theory.
[16] James Richard Lesh. Computational algorithms for coding bound exponents, 1976.
[17] Imre Csiszár, et al. Information Theory: Coding Theorems for Discrete Memoryless Systems, Second Edition, 2011.
[18] Imre Csiszár, et al. Graph decomposition: A new key to coding theorems, 1981, IEEE Trans. Inf. Theory.
[19] Rudolf Ahlswede, et al. Good codes can be produced by a few permutations, 1982, IEEE Trans. Inf. Theory.
[20] Jun Chen, et al. On the Linear Codebook-Level Duality Between Slepian–Wolf Coding and Channel Coding, 2009, IEEE Trans. Inf. Theory.
[21] V.W.S. Chan, et al. Principles of Digital Communication and Coding, 1979.
[22] Lizhong Zheng, et al. Error-and-Erasure Decoding for Block Codes with Feedback, 2008, ISIT.
[23] Charalambos D. Charalambous, et al. Robust coding for uncertain sources: a minimax approach, 2005, IEEE International Symposium on Information Theory (ISIT).
[24] Shigeaki Kuzuoka. On the redundancy of variable-rate Slepian-Wolf coding, 2012, International Symposium on Information Theory and its Applications (ISITA).
[25] M. Sion. On general minimax theorems, 1958.
[26] Jun Chen, et al. On Universal Variable-Rate Slepian-Wolf Coding, 2008, IEEE International Conference on Communications (ICC).
[27] Jun Chen, et al. On the Reliability Function of Variable-Rate Slepian-Wolf Coding, 2017, Entropy.
[28] Neri Merhav. Erasure/List Exponents for Slepian–Wolf Decoding, 2014, IEEE Trans. Inf. Theory.
[29] Suguru Arimoto. Computation of random coding exponent functions, 1976, IEEE Trans. Inf. Theory.
[30] Jun Chen, et al. On the Duality Between Slepian–Wolf Coding and Channel Coding Under Mismatched Decoding, 2006, IEEE Trans. Inf. Theory.
[31] Shun Watanabe, et al. An information-spectrum approach to weak variable-length Slepian-Wolf coding, 2014, IEEE International Symposium on Information Theory (ISIT).
[32] S. Sarvotham, et al. Variable-Rate Universal Slepian-Wolf Coding with Feedback, 2005, Asilomar Conference on Signals, Systems and Computers.
[33] Lizhong Zheng, et al. Errors-and-Erasures Decoding for Block Codes With Feedback, 2008, IEEE Trans. Inf. Theory.
[34] Jun Chen, et al. On the Redundancy of Slepian–Wolf Coding, 2009, IEEE Trans. Inf. Theory.
[35] Emre Telatar, et al. Exponential bounds for list size moments and error probability, 1998, IEEE Information Theory Workshop.
[36] Rudolf Ahlswede, et al. Coloring hypergraphs: A new approach to multi-user source coding, Part 1, 1979.
[37] Neri Merhav, et al. On optimum strategies for minimizing the exponential moments of a loss function, 2012, IEEE International Symposium on Information Theory (ISIT).
[38] D. A. Bell, et al. Information Theory and Reliable Communication, 1969.
[39] Jack K. Wolf, et al. Noiseless coding of correlated information sources, 1973, IEEE Trans. Inf. Theory.
[40] Evgueni Haroutunian, et al. Reliability Criteria in Information Theory and in Statistical Hypothesis Testing, 2008, Found. Trends Commun. Inf. Theory.
[41] William H. Press, et al. Numerical Recipes in C, 2002.
[42] Neri Merhav, et al. On Zero-Rate Error Exponents of Finite-State Channels With Input-Dependent States, 2014, IEEE Trans. Inf. Theory.