Huffman codes and self-information

In this paper the connection between the self-information of a source letter from a finite alphabet and its codeword length in a Huffman code is investigated. Consider the set of all independent finite-alphabet sources that contain a source letter $a$ of probability $p$. The maximum, over this set, of the length of a Huffman codeword for $a$ is determined. This maximum remains constant as $p$ varies between the reciprocals of two consecutive Fibonacci numbers. For small $p$, this maximum is approximately $\left[ \log_{2} \frac{1+\sqrt{5}}{2} \right]^{-1} \approx 1.44$ times the self-information $\log_{2}(1/p)$.
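The Fibonacci connection can be illustrated with a small sketch (not from the paper; the helper name `huffman_lengths` and the choice of 12 letters are illustrative assumptions). Taking letter weights proportional to the Fibonacci numbers $1, 1, 2, 3, 5, \ldots$ forces the Huffman merges into a maximally skewed "caterpillar" tree, so the rarest letter receives the longest codeword the construction allows:

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Return {symbol_index: codeword_length} for a Huffman code on probs."""
    # Heap items are (probability, tiebreak, {symbol: depth}); the unique
    # tiebreak counter prevents Python from ever comparing the dicts.
    counter = itertools.count()
    heap = [(p, next(counter), {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)   # two least probable subtrees
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}  # one level deeper
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]

# Weights proportional to the first 12 Fibonacci numbers: the worst case,
# since each merge result pairs with the next Fibonacci weight in turn.
fib = [1, 1]
while len(fib) < 12:
    fib.append(fib[-1] + fib[-2])
total = sum(fib)                      # = 376
probs = [f / total for f in fib]

lengths = huffman_lengths(probs)
p = probs[0]                          # rarest letter
print(lengths[0])                     # its codeword length: 11 (= n - 1)
print(1.44 * math.log2(1 / p))        # the asymptotic bound from the abstract
```

Here the rarest letter's codeword length equals $n-1$ for $n$ letters, and it stays below $1.44 \log_{2}(1/p)$; as $p$ shrinks the ratio approaches that constant.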