Turing and randomness

In an unpublished manuscript, Turing anticipated the basic ideas behind the theory of algorithmic randomness by nearly 30 years. He used a computationally constrained version of “measure theory” to answer a question of Borel in number theory, a question concerned with constructing what are called “absolutely normal” numbers. In this article, I will try to explain what these mysterious terms mean, and what Turing did.

(Supported by the Marsden Fund of New Zealand.)

1 Borel, number theory and normality

1.1 Repeated decimals in fractions

Mathematicians have always been fascinated with patterns in numbers. At a very early stage of our education, we learn about the special nature of the decimal expansions of rational numbers. Recall that a real number is rational if it is a fraction: it can be expressed as p/q for some integers p and q with q ≠ 0. The reader might remember from school, or maybe first-year university, that numbers like √2 are not rational, and it can be shown that “most” numbers (in a precise mathematical sense) are irrational.

Long ago, the Greeks showed that a real number between 0 and 1 is rational if and only if it has a finite decimal expansion, or a decimal expansion which repeats from some point onwards. For example, 1/4 = 0.25, and 3/7 = 0.428571428571428571.... The astute reader will notice that we need a bit of care with the 1/4 case, since we could also think of it as 0.24999999.... On the other hand, this alternative expansion does fit the bill of repeating from some point onwards. For simplicity, we will ignore such ambiguities.

The reader might also remember that we can count using different bases; in fact, using base 10 is a relatively recent convention. Binary is a standard such base, where each place in the representation stands for a power of 2. For example, 7 = 4 + 2 + 1, and hence in binary 7 is written as 111. In base 3 we only use the digits 0, 1 and 2, and express numbers using powers of 3; in base 3, 7 becomes 21, which is 2 × 3 + 1. Note that bases can be bigger than 10, in which case we have to invent symbols to represent the larger “digits”. You may have noticed the ISBN code on a book: ISBN codes are base 11, and use X to represent 10.

The Greek result about repeats in the decimal representation of rationals remains true if we change from base 10 to any other base. For instance, in base 3, 1/4 = 0.020202.... In the discussion below, we will henceforth drop the decimal point and be concerned with infinite sequences.
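To make the repetition concrete, here is a minimal Python sketch (my own illustration, not something from the article) that computes the base-b expansion of p/q by long division. Since there are only finitely many possible remainders, either the division terminates or some remainder recurs, and the digits repeat from that point onwards. The function name and interface are hypothetical.

```python
def expansion(p, q, base=10, max_digits=30):
    """Digits of p/q (with 0 < p < q) in the given base, by long division.

    Returns (digits, cycle_start): the digits repeat from position cycle_start
    onwards; cycle_start is None if the expansion terminated (or max_digits was
    reached before any remainder recurred).
    """
    digits = []
    seen = {}                      # remainder -> position where it first appeared
    r = p % q
    while r != 0 and r not in seen and len(digits) < max_digits:
        seen[r] = len(digits)
        r *= base
        digits.append(r // q)      # next digit of the expansion
        r %= q
    return digits, seen.get(r)

print(expansion(3, 7))             # ([4, 2, 8, 5, 7, 1], 0): 3/7 = 0.428571 428571 ...
print(expansion(1, 4, base=3))     # ([0, 2], 0): in base 3, 1/4 = 0.02 02 02 ...
print(expansion(1, 4))             # ([2, 5], None): 1/4 = 0.25 terminates in base 10
```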
1.2 Borel and normality

In 1909, Émile Borel [8] was interested in sequences which satisfy the law of large numbers. This law says that if you repeat an experiment many times, then the average outcome should approach the expected value: if you toss a fair coin many times, you expect to see heads about half of the time. In base 10, the law says that the frequency of each digit between 0 and 9 is exactly what you would expect in the limit, namely 1/10. What do we mean by “the limit”? Base 2 corresponds to tossing a coin, and over time you would expect as many heads as tails. But this is only the eventual long-term behaviour. If I toss a head, the next toss is independent of this one, so with probability 1/2 I would again get a head. Nevertheless, if the coin is fair, the law of large numbers states that it will all “even out in the long run”.

Take such a sequence X representing an infinite collection of coin tosses. After s coin tosses, we can see how we are going so far by comparing the number of heads seen in the first s tosses with the total number of tosses so far. That is, we could examine the ratio at step s:

|{k ≤ s : X(k) = 1}| / s.
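As a quick illustration of this ratio, the following Python simulation (again my own sketch, not part of Turing's or Borel's work; the function name and checkpoints are arbitrary) tosses a fair coin and prints the running proportion of heads:

```python
import random

def running_head_frequency(num_tosses, seed=0):
    """Simulate fair coin tosses (1 = heads, 0 = tails) and print the
    ratio |{k <= s : X(k) = 1}| / s at a few checkpoints."""
    rng = random.Random(seed)
    heads = 0
    for s in range(1, num_tosses + 1):
        heads += rng.randint(0, 1)               # one fair coin toss
        if s in (10, 100, 1000, 10000, 100000):
            print(f"after {s:>6} tosses: heads/s = {heads / s:.4f}")

running_head_frequency(100_000)
```

For a typical seed the ratio drifts towards 0.5, but the law of large numbers only promises this in the limit, not at any particular finite stage.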

References

[1] É. Borel, Les probabilités dénombrables et leurs applications arithmétiques, 1909.
[2] P. Erdős et al., Note on normal numbers, 1946.
[3] R. G. Downey et al., Algorithmic Randomness and Complexity, Theory and Applications of Computability, 2010.
[4] R. J. Solomonoff et al., A Formal Theory of Inductive Inference. Part I, Inf. Control, 1964.
[5] A. Shen et al., Ergodic-Type Characterizations of Algorithmic Randomness, CiE, 2010.
[6] A. M. Turing, On Computable Numbers, with an Application to the Entscheidungsproblem, 1937.
[7] J. P. Bowen et al., Turing's legacy, The Turing Guide, 2017.
[8] R. J. Solomonoff et al., A Formal Theory of Inductive Inference. Part II, Inf. Control, 1964.
[9] M. Li et al., An Introduction to Kolmogorov Complexity and Its Applications, Texts in Computer Science, 2019.
[10] A. Church, On the concept of a random sequence, 1940.
[11] C.-P. Schnorr, Zufälligkeit und Wahrscheinlichkeit - Eine algorithmische Begründung der Wahrscheinlichkeitstheorie, Lecture Notes in Mathematics, 1971.
[12] V. Becher et al., Turing's Normal Numbers: Towards Randomness, CiE, 2012.
[13] É. Borel, Leçons sur la théorie des fonctions, 2009.
[14] G. J. Chaitin et al., A recent technical report, SIGA, 1974.
[15] M. Braverman et al., Non-computable Julia sets, arXiv, 2004.
[16] S. Figueira et al., An example of a computable absolutely normal number, Theor. Comput. Sci., 2002.
[17] A. Wigderson et al., P = BPP if E requires exponential circuits: derandomizing the XOR lemma, STOC '97, 1997.
[18] T. Meyerovitch et al., A Characterization of the Entropies of Multidimensional Shifts of Finite Type, math/0703206, 2007.
[19] H. Zenil, Randomness Through Computation: Some Answers, More Questions, 2011.
[20] V. Becher et al., A polynomial-time algorithm for computing absolutely normal numbers, Inf. Comput., 2013.
[21] S. S. Pillai, On normal numbers, 1939.
[22] W. Fouché, The Descriptive Complexity of Brownian Motion, 2000.
[23] D. H. Bailey et al., On the Random Character of Fundamental Constant Expansions, Exp. Math., 2001.
[24] S. Figueira et al., Turing's unpublished algorithm for normal numbers, Theor. Comput. Sci., 2007.
[25] C.-P. Schnorr, Zufälligkeit und Wahrscheinlichkeit, 1971.
[26] L. A. Levin et al., Some theorems on the algorithmic approach to probability theory and information theory (1971 dissertation directed by A. N. Kolmogorov), Ann. Pure Appl. Log., 2010.
[27] A. Nies, Computability and randomness, 2009.
[28] A. M. Turing et al., Computing Machinery and Intelligence, The Philosophy of Artificial Intelligence, 1950.
[29] A. Kolmogorov, Three approaches to the quantitative definition of information, 1968.
[30] E. Tornier, Grundlagen der Wahrscheinlichkeitsrechnung, 1933.
[31] Y. Bugeaud, Nombres de Liouville et nombres normaux, 2002.
[32] W. Sierpiński et al., Démonstration élémentaire du théorème de M. Borel sur les nombres absolument normaux et détermination effective d'un tel nombre, 1917.
[33] H. Lebesgue, Sur certaines démonstrations d'existence, 1917.
[34] R. de Wolf et al., Algorithmic Clustering of Music Based on String Compression, Computer Music Journal, 2004.
[35] C.-P. Schnorr et al., Endliche Automaten und Zufallsfolgen, Acta Informatica, 1972.
[36] Bulletin de la Société Mathématique de France, 2022.
[37] J. H. Lutz et al., Finite-State Dimension, ICALP, 2001.
[38] C.-P. Schnorr et al., A unified approach to the definition of random sequences, Mathematical Systems Theory, 1971.
[39] N. V. Vinodchandran et al., Entropy rates and finite-state dimension, Theor. Comput. Sci., 2005.
[40] P. Martin-Löf et al., The Definition of Random Sequences, Inf. Control, 1966.