A Theory of Program Size Formally Identical to Information Theory

A new definition of program-size complexity is made. H(A,B/C,D) is defined to be the size in bits of the shortest self-delimiting program for calculating strings A and B if one is given a minimal-size self-delimiting program for calculating strings C and D. This differs from previous definitions: (1) programs are required to be self-delimiting, i.e. no program is a prefix of another, and (2) instead of being given C and D directly, one is given a program for calculating them that is minimal in size. Unlike previous definitions, this one has precisely the formal properties of the entropy concept of information theory. For example, H(A,B) = H(A) + H(B/A) + O(1). Also, if a program of length k is assigned measure 2^-k, then H(A) = -log2 (the probability that the standard universal computer will calculate A) + O(1).
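Requirement (1), that no program be a prefix of another, is the prefix-free condition familiar from coding theory. As a minimal sketch — an illustration of the general idea, not Chaitin's construction — the following shows one standard way to make arbitrary bit strings self-delimiting: double each bit of a binary length field and terminate the field with "01", so a decoder always knows where a codeword ends.

```python
def encode(s: str) -> str:
    """Self-delimiting encoding of a bit string: each bit of the
    length is doubled (0 -> 00, 1 -> 11), the pair '01' marks the
    end of the length field, and the payload follows."""
    length_bits = bin(len(s))[2:]  # "0" when s is empty
    header = "".join(b * 2 for b in length_bits) + "01"
    return header + s

def decode(code: str) -> str:
    """Inverse of encode: read doubled bits until the '01'
    terminator, then read that many payload bits."""
    i = 0
    length_bits = ""
    while code[i:i + 2] != "01":
        length_bits += code[i]
        i += 2
    i += 2  # skip the '01' terminator
    n = int(length_bits, 2)
    return code[i:i + n]

# No codeword is a prefix of another, so codewords can be
# concatenated and parsed left to right without separators.
words = ["", "0", "1", "01", "110", "10101"]
codes = [encode(w) for w in words]
assert all(decode(c) == w for w, c in zip(words, codes))
assert not any(a != b and b.startswith(a) for a in codes for b in codes)
```

Because every codeword announces its own end, this is the property that makes the measure assignment coherent: the lengths of a prefix-free set of programs satisfy the Kraft inequality, so giving each program of length k measure 2^-k yields a total measure of at most 1.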
