A CORRECTION CONCERNING COMPLEXITY

In an earlier paper¹ I defined the complexity of a proposition H as equal to the amount of information in it, that is, as I(H) = −log P(H). It disturbed me that the proposition 0 = 1, which looks simple, is thus attributed infinite complexity, but I accepted this implication because this proposition implies all other propositions. K. S. Friedman (private communication) has now pointed out an objection to my postulate that H.K is more complex than H. He shows that it leads quickly to the conclusion that A ∨ B is simpler than A, which is counter-intuitive. The postulate would be reasonable if H and K were entirely independent of each other (in fact it is reasonable to assume that the complexities are additive in this case), but I must now withdraw the definition of complexity as given. Fortunately this has very little effect on the rest of the paper. The definition might make sense if restricted to a conjunction of 'atomic propositions', or if P(H) is interpreted not as the probability of the proposition H, but as the maximum probability of an expression of H, regarded as a linguistic text. This probability could be evaluated approximately by regarding the language as a Markov chain of any finite order. Such approximations are familiar to cryptanalysts and are mentioned, for example, in Shannon's basic paper on communication theory. For more details, see my forthcoming paper 'Explicativity, corroboration, and the relative odds of hypotheses', in the proceedings of the conference on 'Methodologies: Bayesian and Popperian', Columbia, South Carolina, November 1973. As mentioned in my paper, Valéry (1921) described a figure as 'geometric' if it can be traced by motions which can be expressed in few words. Perhaps he obtained this idea from Émile Lemoine's book Géométrographie, where measures of complexity of geometrical constructions are given. See also J. L. Coolidge, A Treatise on the Circle and Sphere.
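The following is a brief reconstruction, from the definitions quoted above, of the additivity remark and of how an objection of Friedman's kind arises; it is a sketch of one possible route, not necessarily his own argument. If H and K are independent, the complexities are indeed additive:

    I(H.K) = -\log P(H.K) = -\log\bigl(P(H)\,P(K)\bigr) = I(H) + I(K),

so H.K is more complex than H whenever I(K) > 0. But under the definition alone, since A entails A ∨ B,

    P(A \lor B) \ge P(A) \;\Longrightarrow\; I(A \lor B) = -\log P(A \lor B) \le -\log P(A) = I(A);

and, taking H = A ∨ B and K = A ∨ ¬B, the conjunction H.K is logically equivalent to A, so the postulate would make A strictly more complex than A ∨ B.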
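The Markov-chain evaluation of the probability of an expression, regarded as a linguistic text, can be illustrated by a small computational sketch. It is only an illustration under assumptions of my own (an order-k character model with add-one smoothing); the function name, the toy corpus, and the two sample expressions are hypothetical and do not come from the note above. P(H) would then be taken as the largest probability (smallest −log P) attained by any expression of H.

    from collections import defaultdict
    from math import log2

    def neg_log_prob(text, corpus, order=2):
        # Estimate -log2 P(text) under an order-`order` character Markov model
        # whose transition counts are taken from `corpus`, with add-one smoothing
        # over the observed alphabet so unseen transitions keep a finite score.
        alphabet = set(corpus) | set(text)
        contexts = defaultdict(int)
        transitions = defaultdict(int)
        for i in range(len(corpus) - order):
            ctx = corpus[i:i + order]
            contexts[ctx] += 1
            transitions[(ctx, corpus[i + order])] += 1
        total = 0.0
        for i in range(len(text) - order):
            ctx, sym = text[i:i + order], text[i + order]
            p = (transitions[(ctx, sym)] + 1) / (contexts[ctx] + len(alphabet))
            total += -log2(p)
        return total

    if __name__ == '__main__':
        # Any long sample of English would serve as the corpus; a short repeated
        # string is used here only so that the example is self-contained.
        corpus = ("a figure is geometric if it can be traced by motions "
                  "which can be expressed in few words ") * 20
        for expr in ("circle through three points",
                     "the unique circle passing through three given points"):
            print(expr, neg_log_prob(expr, corpus))

Of two wordings of the same hypothesis, the one the language model finds more probable yields the smaller −log probability and would therefore count as the simpler expression.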