Ultimate Intelligence Part II: Physical Measure and Complexity of Intelligence

We continue our analysis of the volume and energy measures that are appropriate for quantifying inductive inference systems. We extend the logical depth and conceptual jump size measures of algorithmic information theory (AIT) to stochastic problems, and introduce physical measures of computation based on volume and energy. We introduce a graphical model of computational complexity that we believe to be appropriate for intelligent machines, and we show several asymptotic relations between the energy, logical depth, and volume of computation for inductive inference. In particular, we arrive at a "black-hole equation" of inductive inference, which relates the energy, volume, space, and algorithmic information of an optimal inductive inference solution. We introduce energy-bounded algorithmic entropy, and we briefly apply our ideas to the physical limits of intelligent computation in our universe.
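To fix notation for the quantities named above, the following sketch recalls the standard AIT definitions; the energy-bounded form is our own illustrative guess at what such a measure could look like, not a definition quoted from the paper. Let U be a universal prefix machine, t_U(p) the running time of program p on U, K(x) the prefix Kolmogorov complexity of x, and E_U(p) a hypothetical functional giving the energy dissipated by running p on U:

  Kt(x)      = min_p { |p| + log t_U(p) : U(p) = x }          (Levin complexity)
  depth_s(x) = min_p { t_U(p) : U(p) = x, |p| ≤ K(x) + s }    (Bennett's logical depth at significance level s)
  H_E(x)     = min_p { |p| : U(p) = x, E_U(p) ≤ E }           (energy-bounded algorithmic entropy, hedged sketch)

By Landauer's principle, each irreversible bit operation dissipates at least kT ln 2 of energy, so in this reading an energy bound E also bounds the number of irreversible steps available to any witnessing program p.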
