Algorithmically probable mutations reproduce aspects of evolution such as convergence rate, genetic memory, modularity, diversity explosions, and mass extinction

Natural selection explains how life has evolved over millions of years from more primitive forms. The speed at which this happens, however, has sometimes defied explanations based on random (uniformly distributed) mutations. Here we investigate the application of algorithmic mutations (no recombination) to binary matrices, with mutations drawn from numerical approximations to algorithmic probability, in order to compare evolutionary convergence rates against the null hypothesis (uniformly distributed mutations). Results on both synthetic and small biological examples show an accelerated rate of convergence when using algorithmic probability. We also show that algorithmically evolved modularity provides an advantage that produces a genetic memory. We demonstrate that regular structures are preserved and carried on once they first occur, and that they can lead to an accelerated production of diversity and extinction, possibly explaining naturally occurring phenomena such as diversity explosions (e.g. the Cambrian) and mass extinctions (e.g. the End Triassic) whose causes have eluded researchers and remain a matter of debate. The approach introduced here appears to be a better approximation to biological evolution than models based exclusively on random uniform mutations, and it also better approximates a formal version of open-ended evolution based on previous results. The results validate the motivations and results of Chaitin's Metabiology programme and previous suggestions that computation may be an equally important driver of evolution, acting together with, and even before, the action and result of natural selection. We also show that applying the method to optimization problems, such as genetic algorithms, has the potential to accelerate the convergence of artificial evolutionary algorithms.
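The core idea above, mutations sampled with probability weighted by the algorithmic complexity of the resulting genome rather than uniformly, can be sketched as follows. This is a minimal illustration, not the paper's method: the authors approximate algorithmic probability numerically (e.g. via the Coding Theorem Method), whereas here compressed length via `zlib` is used as a crude stand-in for algorithmic complexity, and all function names are hypothetical.

```python
import random
import zlib

def complexity(bits: str) -> int:
    # Crude proxy for algorithmic complexity K(s): length of the
    # compressed bit string. (A stand-in assumption; the paper uses
    # numerical approximations to algorithmic probability instead.)
    return len(zlib.compress(bits.encode()))

def uniform_mutation(genome: str) -> str:
    # Null hypothesis: flip one uniformly chosen bit.
    i = random.randrange(len(genome))
    return genome[:i] + ("1" if genome[i] == "0" else "0") + genome[i + 1:]

def algorithmic_mutation(genome: str, candidates: int = 20) -> str:
    # Generate candidate single-bit flips, then pick one with probability
    # proportional to ~2^-K(s): simpler (more compressible) offspring are
    # exponentially more likely, mimicking the universal distribution.
    pool = [uniform_mutation(genome) for _ in range(candidates)]
    weights = [2.0 ** -complexity(child) for child in pool]
    return random.choices(pool, weights=weights, k=1)[0]
```

Comparing convergence toward a structured target under `algorithmic_mutation` versus `uniform_mutation` reproduces, in miniature, the kind of rate comparison the abstract describes.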
