On the Entropy of Context-Free Languages

The information-theoretic concept of the entropy (channel capacity) of context-free languages and its relation to the structure generating function is investigated in the first part of this paper. The results obtained are applied to the family of pseudolinear grammars. In the second part, relations between context-free grammars, infinite labelled digraphs and infinite nonnegative matrices are exhibited. Theorems on the convergence parameter of infinite matrices are proved and applied to the evaluation of the entropy of certain context-free languages. Finally, a stochastic process is associated with any context-free language generated by a deterministic labelled digraph, such that the stochastic process is equivalent to the language in the sense that both have the same entropy.
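As an illustrative sketch (not part of the paper): the entropy of a language L can be computed as the limit of (1/n) log₂ sₙ, where sₙ counts the words of length n, and it equals log₂(1/R) for R the radius of convergence of the structure generating function Σ sₙ zⁿ. For the Dyck language of balanced parentheses, s₂ₙ is the n-th Catalan number, R = 1/2, and the entropy is 1 bit per symbol; the finite-n estimates below approach that value.

```python
from math import comb, log2

def catalan(n):
    # number of Dyck words (balanced parenthesis strings) of length 2n
    return comb(2 * n, n) // (n + 1)

def entropy_estimate(n):
    # (1/len) * log2(number of words of that length); converges to
    # log2(1/R) = 1, where R = 1/2 is the radius of convergence of
    # the structure generating function sum_n catalan(n) * z^(2n)
    return log2(catalan(n)) / (2 * n)

for n in (10, 100, 1000):
    print(2 * n, entropy_estimate(n))
```

The slow convergence (the estimate at length 2000 is still about 0.99) reflects the polynomial correction factor in the asymptotics of the Catalan numbers, Cₙ ~ 4ⁿ / (n^{3/2} √π).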