Guessing and entropy

It is shown that the average number of successive guesses, E[G], required with an optimum strategy until one correctly guesses the value of a discrete random variable X, is lower-bounded by the entropy H(X) in the manner E[G] ≥ (1/4)2^{H(X)} + 1, provided that H(X) ≥ 2 bits. This bound is tight within a factor of 4/e when X is geometrically distributed. It is further shown that E[G] may be arbitrarily large when H(X) is an arbitrarily small positive number, so that there is no interesting upper bound on E[G] in terms of H(X).
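
As an illustration not found in the original abstract, the following minimal sketch computes E[G] under the optimum strategy of guessing values in order of decreasing probability and compares it with the lower bound (1/4)2^{H(X)} + 1. The truncated geometric distribution and the parameter r are assumptions chosen for the example.

```python
import numpy as np

def expected_guesses(p):
    """E[G] under the optimum strategy: guess values in order of
    decreasing probability, so the i-th guess succeeds with the
    i-th largest probability."""
    p = np.sort(np.asarray(p, dtype=float))[::-1]
    return np.sum(np.arange(1, len(p) + 1) * p)

def entropy_bits(p):
    """Shannon entropy H(X) in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Example distribution: a truncated geometric with ratio r (an
# assumption for illustration), the case where the bound is tight
# within a factor of 4/e.
r = 0.9
k = np.arange(1, 2000)
p = (1 - r) * r ** (k - 1)
p /= p.sum()  # renormalise after truncation

H = entropy_bits(p)
EG = expected_guesses(p)
bound = 0.25 * 2 ** H + 1  # lower bound, valid when H(X) >= 2 bits
print(f"H(X) = {H:.3f} bits, E[G] = {EG:.3f}, bound = {bound:.3f}")
```

Running the sketch shows E[G] exceeding the bound, with the ratio of the two staying within the stated factor of 4/e for the geometric case.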