Huffman Redundancy for Large Alphabet Sources
[1] W. Rudin. Principles of Mathematical Analysis, 1964.
[2] Gyula O. H. Katona et al. Huffman codes and self-information. IEEE Trans. Inf. Theory, 1976.
[3] Alfredo De Santis et al. Tight upper bounds on the redundancy of Huffman codes. IEEE Trans. Inf. Theory, 1989.
[4] T. Aaron Gulliver et al. Near-optimality of the minimum average redundancy code for almost all monotone sources. IEICE Trans. Fundam. Electron. Commun. Comput. Sci., 2011.
[5] T. Aaron Gulliver et al. The minimum average code for finite memoryless monotone sources. IEEE Trans. Inf. Theory, 2007.
[6] T. Aaron Gulliver et al. How suboptimal is the Shannon code? IEEE Trans. Inf. Theory, 2013.
[7] Alfredo De Santis et al. New bounds on the redundancy of Huffman codes. IEEE Trans. Inf. Theory, 1991.
[8] David C. van Voorhis et al. Optimal source codes for geometrically distributed integer alphabets (Corresp.). IEEE Trans. Inf. Theory, 1975.
[9] Wojciech Szpankowski et al. Asymptotic average redundancy of Huffman (and other) block codes. IEEE Trans. Inf. Theory, 2000.
[10] Raymond W. Yeung et al. A simple upper bound on the redundancy of Huffman codes. IEEE Trans. Inf. Theory, 2002.
[11] Ugo Vaccaro et al. Bounding the average length of optimal source codes via majorization theory. IEEE Trans. Inf. Theory, 2004.
[12] L. Devroye. Non-Uniform Random Variate Generation, 1986.
[13] David A. Huffman et al. A method for the construction of minimum-redundancy codes. Proceedings of the IRE, 1952.