A new one-pass algorithm for constructing dynamic Huffman codes is introduced and analyzed. We also analyze the one-pass algorithm due to Faller, Gallager, and Knuth. In each algorithm, both the sender and the receiver maintain equivalent dynamically varying Huffman trees, and the coding is done in real time. We show that the number of bits used by the new algorithm to encode a message containing t letters is fewer than t bits more than that used by the conventional two-pass Huffman scheme, independent of the alphabet size. This is best possible in the worst case for any one-pass Huffman method. Tight upper and lower bounds are derived. Empirical tests show that the encodings produced by the new algorithm are shorter than those of the other one-pass algorithm and, except for long messages, shorter than those of the two-pass method. The new algorithm is well suited for on-line encoding/decoding in data networks and for file compression.
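The key invariant described above, that sender and receiver maintain equivalent dynamically varying Huffman trees with no side channel, can be illustrated with a deliberately naive one-pass scheme: both parties start from identical smoothed counts and rebuild the Huffman code after every symbol. This sketch is not the paper's algorithm (which updates the tree incrementally in real time rather than rebuilding it); all names here (`build_code`, `encode`, `decode`) are illustrative, and the message length is assumed to be conveyed separately in place of an end-of-message symbol.

```python
import heapq

def build_code(freq):
    """Build a Huffman code (symbol -> bitstring) from frequency counts.
    Deterministic tie-breaking by symbol keeps the sender's and the
    receiver's trees identical, which is the invariant being illustrated."""
    # Heap entries are (frequency, tiebreak symbol, subtree); subtrees are
    # either a symbol (leaf) or a (left, right) tuple (internal node).
    heap = [(f, s, s) for s, f in sorted(freq.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, t1, a = heapq.heappop(heap)
        f2, t2, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, min(t1, t2), (a, b)))
    code = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            code[node] = prefix or "0"  # degenerate one-symbol alphabet
    walk(heap[0][2], "")
    return code

def encode(message, alphabet):
    # Laplace-smoothed counts so unseen symbols are always encodable.
    freq = {s: 1 for s in alphabet}
    bits = []
    for s in message:
        bits.append(build_code(freq)[s])
        freq[s] += 1  # update AFTER coding, mirrored by the decoder
    return "".join(bits)

def decode(bits, n, alphabet):
    freq = {s: 1 for s in alphabet}
    out, i = [], 0
    for _ in range(n):
        inv = {v: k for k, v in build_code(freq).items()}
        word = ""
        while word not in inv:      # prefix-free: first match is correct
            word += bits[i]
            i += 1
        out.append(inv[word])
        freq[out[-1]] += 1          # same update rule as the encoder
    return "".join(out)
```

Because the decoder applies the exact same deterministic tree construction and count update after each symbol, it stays synchronized with the encoder; the dynamic algorithms analyzed in the paper achieve the same synchrony while updating the tree in time proportional to the codeword length instead of rebuilding it.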