Using counterexamples, we show that vocabulary size and static and dynamic branching factors are all inadequate as measures of the speech recognition complexity of finite-state grammars. Information-theoretic arguments show that perplexity (the logarithm of which is the familiar entropy) is a more appropriate measure of equivalent choice. It too has certain weaknesses, which we discuss. We show that perplexity can also be applied to languages having no obvious statistical description, since an entropy-maximizing probability assignment can be found for any finite-state grammar. Table I shows perplexity values for some well-known speech recognition tasks.

Table I. Perplexity values for some well-known speech recognition tasks.

Task          Perplexity (phone)   Perplexity (word)   Vocabulary size   Dynamic branching factor
IBM-Lasers    2.14                 21.11               1000              1000
IBM-Raleigh   1.69                 7.74                250               7.32
CMU-AIX05     1.52                 6.41                1011              35
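To make the entropy/perplexity relationship concrete: if a model assigns a conditional probability to each word of a test sequence, the per-word entropy is the negative average log-probability (in bits), and the perplexity is 2 raised to that entropy. The following sketch is ours, for illustration only; the function name and the toy probabilities are not from the paper.

```python
import math

def perplexity(word_probs):
    """Perplexity of a sequence under a model that assigns each word
    a conditional probability word_probs[i] = P(w_i | w_1 .. w_{i-1}).
    Per-word entropy H = -(1/n) * sum(log2 p); perplexity = 2**H."""
    n = len(word_probs)
    entropy = -sum(math.log2(p) for p in word_probs) / n
    return 2 ** entropy

# A uniform choice among k equally likely words gives perplexity k:
print(perplexity([0.1] * 5))   # -> 10.0 (uniform over 10 words)
```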
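As for the entropy-maximizing assignment mentioned above: for a strongly connected finite-state grammar, the standard Shannon/Parry construction obtains it from the Perron eigenvalue λ of the grammar's transition-count matrix, giving maximal per-word entropy log₂ λ and hence perplexity λ. The sketch below is a minimal illustration of that construction under our assumptions, using a made-up two-state grammar (not one of the Table I tasks), and is not necessarily the computation used in the paper.

```python
import numpy as np

# Transition-count matrix of a toy finite-state grammar:
# A[i, j] = number of distinct words that take state i to state j.
A = np.array([[1., 2.],
              [3., 0.]])

# Perron (largest) eigenvalue and its eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam = eigvals[k].real
v = np.abs(eigvecs[:, k].real)   # Perron vector, made positive

# Entropy-maximizing transition probabilities (the Parry measure):
# each individual word on an arc i -> j receives probability
# v[j] / (lam * v[i]); P[i, j] aggregates the A[i, j] parallel words.
P = A * v[None, :] / (lam * v[:, None])
assert np.allclose(P.sum(axis=1), 1.0)   # rows are distributions

# Under this assignment the per-word entropy is log2(lam), so the
# grammar's maximum-entropy perplexity is lam itself.
print("perplexity =", lam)   # -> 3.0 for this toy grammar
```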