New bounds on the expected length of one-to-one codes

We provide new bounds on the expected length L of a binary one-to-one code for a discrete random variable X with entropy H. We prove that L ≥ H − log(H + 1) − H log(1 + 1/H), which improves on previously known lower bounds. Furthermore, we provide upper bounds on the expected length of the best code as a function of H and the probability of the most likely source letter.
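The lower bound above can be evaluated numerically. The sketch below is illustrative only: it assumes all logarithms are base 2 (expected length measured in bits), and the function name is ours, not from the paper.

```python
import math

def one_to_one_lower_bound(H: float) -> float:
    """Evaluate the lower bound L >= H - log(H+1) - H*log(1 + 1/H)
    on the expected length of a binary one-to-one code, for a source
    with entropy H > 0. Logs are base 2 (bits); the function name is
    a hypothetical label, not from the paper.
    """
    return H - math.log2(H + 1) - H * math.log2(1 + 1 / H)

# Since H*log2(1 + 1/H) increases toward log2(e) as H grows, the bound
# behaves like H - log2(H+1) - log2(e) for large H.
```

Note that for small H the bound can be negative and hence vacuous; it becomes informative as the entropy grows.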
