The Word Entropy and How to Compute It

The complexity function of an infinite word counts the number of its factors of each length. For any positive function f, its exponential rate of growth is \(E_0(f) = \liminf \limits _{n\rightarrow \infty } \frac{1}{n}\log f(n)\). We define a new quantity, the word entropy \(E_W(f)\), as the maximal exponential growth rate of a complexity function bounded above by f. In general \(E_W(f)\) is smaller than \(E_0(f)\), and more difficult to compute; we give an algorithm to estimate it. The quantity \(E_W(f)\) is used to compute the Hausdorff dimension of the set of real numbers whose expansions in a given base have complexity bounded by f.
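To make the definition of \(E_0(f)\) concrete, the quantity \(\frac{1}{n}\log f(n)\) can be evaluated numerically for large n and its infimum taken over a tail. The sketch below is an illustration only, not the paper's algorithm for \(E_W(f)\); the function name and the choice of tail are assumptions made here for the example.

```python
import math

def growth_rate_approx(f, n_max):
    """Approximate E_0(f) = liminf_{n->oo} (1/n) log f(n) by taking the
    minimum of (1/n) log f(n) over the tail n in [n_max//2, n_max].
    This is a finite-n heuristic; the true liminf is a limit."""
    return min(math.log(f(n)) / n for n in range(n_max // 2, n_max + 1))

# Example: f(n) = 2^n grows exponentially with rate log 2, so E_0(f) = log 2.
estimate = growth_rate_approx(lambda n: 2 ** n, 1000)
print(abs(estimate - math.log(2)) < 1e-9)
```

For a genuine complexity function the same tail heuristic applies, though convergence of the liminf can be slow when f oscillates between growth regimes.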