Algorithmic Information Theory

This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that a program whose bits are chosen by coin flipping produces a given output. During the past few years the definitions of algorithmic information theory have been reformulated. The basic features of the new formalism are presented here and certain results of R. M. Solovay are reported.
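
For orientation, the "new formalism" referred to above is Chaitin's self-delimiting (prefix-free) program-size formalism. Writing U for a universal computer whose valid programs p form a prefix-free set of bit strings, the two quantities alluded to in the abstract can be stated as follows; these are the standard definitions, added here for the reader's convenience rather than quoted from the paper:

    H(s) = \min \{\, |p| : U(p) = s \,\}                  (program-size complexity: the number of bits needed to specify s)
    P(s) = \sum_{U(p) = s} 2^{-|p|}                       (the probability that a program whose bits are chosen by coin flipping outputs s)
    \Omega = \sum_{U(p)\ \text{halts}} 2^{-|p|}           (the halting probability)

A basic theorem of this formalism is that H(s) = -\log_2 P(s) + O(1), which is the sense in which program size is formally identical to information theory [12].

References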

[1]  J. Schwartz,et al.  A note on Monte Carlo primality tests and algorithmic information theory , 1978 .

[2]  Ray J. Solomonoff,et al.  Complexity-based induction systems: Comparisons and convergence theorems , 1978, IEEE Trans. Inf. Theory.

[3]  Martin Davis What is a Computation? , 1978 .

[4]  Volker Strassen,et al.  A Fast Monte-Carlo Test for Primality , 1977, SIAM J. Comput..

[5]  R. M. Solovay On Random R.E. Sets , 1977 .

[6]  N. Costa,et al.  Non-Classical Logics, Model Theory, and Computability: Proceedings of the Third Latin-American Symposium on Mathematical Logic, Campinas, Brazil, July 11-17, 1976 , 1977 .

[7]  G. Chaitin Program size, oracles, and the jump operation , 1977 .

[8]  Jozef Gruska,et al.  Descriptional Complexity (of Languages) - A Short Survey , 1976, MFCS.

[9]  Gregory J. Chaitin Information-Theoretic Characterizations of Recursive Infinite Strings , 1976, Theor. Comput. Sci..

[10]  Robert P. Daley Non-Complex Sequences: Characterizations and Examples , 1974, SWAT.

[11]  Gregory J. Chaitin,et al.  Algorithmic Entropy of Sets , 1976 .

[12]  G. Chaitin A Theory of Program Size Formally Identical to Information Theory , 1975, JACM.

[13]  G. Chaitin Randomness and Mathematical Proof , 1975 .

[14]  William Lawrence Gewirtz,et al.  Investigations in the Theory of Descriptive Complexity , 1974 .

[15]  Gregory J. Chaitin,et al.  Information-Theoretic Limitations of Formal Systems , 1974, JACM.

[16]  M. Levin,et al.  Mathematical Logic for Computer Scientists , 1974 .

[17]  Gregory J. Chaitin,et al.  Information-Theoretic Computational Complexity , 1974 .

[18]  Gregory J. Chaitin,et al.  Information-theoretic computational complexity , 1974, IEEE Trans. Inf. Theory.

[19]  Charles H. Bennett,et al.  Logical reversibility of computation , 1973 .

[20]  Claus-Peter Schnorr,et al.  Process complexity and effective random tests , 1973 .

[21]  H. Feistel Cryptography and Computer Privacy , 1973 .

[22]  T. Kamae On Kolmogorov's complexity and information , 1973 .

[23]  T. Fine Theories of Probability: An Examination of Foundations , 1973 .

[24]  Jacob T. Schwartz,et al.  On programming: an interim report on the SETL Project , 1973 .

[25]  P. Martin-Löf Complexity oscillations in infinite binary sequences , 1971 .

[26]  L. Levin,et al.  The Complexity of Finite Objects and the Development of the Concepts of Information and Randomness by Means of the Theory of Algorithms , 1970 .

[27]  R. Solovay A model of set-theory in which every set of reals is Lebesgue measurable , 1970 .

[28]  David G. Willis,et al.  Computational Complexity and Probability Constructions , 1970, JACM.

[29]  P. Martin-Löf On the Notion of Randomness , 1970 .

[30]  G. J. Chaitin,et al.  To a mathematical definition of 'life' , 1970, SIGACT News.

[31]  Gregory J. Chaitin,et al.  On the difficulty of computations , 1970, IEEE Trans. Inf. Theory.

[32]  Donald W. Loveland,et al.  A Variant of the Kolmogorov Concept of Complexity , 1969, Information and Control.

[33]  Gregory J. Chaitin,et al.  On the Simplicity and Speed of Programs for Computing Infinite Sets of Natural Numbers , 1969, J. ACM.

[34]  P. Martin-Löf,et al.  Algorithms and Randomness , 1969 .

[35]  Gregory J. Chaitin,et al.  On the Length of Programs for Computing Finite Binary Sequences: statistical considerations , 1969, JACM.

[36]  Andrei N. Kolmogorov,et al.  Logical basis for information theory and probability theory , 1968, IEEE Trans. Inf. Theory.

[37]  C. S. Wallace,et al.  An Information Measure for Classification , 1968, Comput. J..

[38]  A. Kolmogorov Three approaches to the quantitative definition of information , 1968 .

[39]  Donald Ervin Knuth,et al.  The Art of Computer Programming , 1968 .

[40]  Per Martin-Löf,et al.  The Definition of Random Sequences , 1966, Inf. Control..

[41]  Gregory J. Chaitin,et al.  On the Length of Programs for Computing Finite Binary Sequences , 1966, JACM.

[42]  D. Loveland The Kleene hierarchy classification of recursively random sequences , 1966 .

[43]  D. Loveland A New Interpretation of the von Mises' Concept of Random Sequence , 1966 .

[44]  John von Neumann,et al.  Theory of Self-Reproducing Automata , 1967 .

[45]  Ray J. Solomonoff,et al.  A Formal Theory of Inductive Inference. Part II , 1964, Inf. Control..

[46]  A. Church On the concept of a random sequence , 1940 .

[47]  G. Doetsch Review of: Richard von Mises, Wahrscheinlichkeit, Statistik und Wahrheit (Schriften zur wissenschaftlichen Weltauffassung, Vol. 3, Julius Springer, Vienna, 1928, vii + 189 pp.) .

[48]  Richard von Mises Wahrscheinlichkeit, Statistik und Wahrheit , 1936 .

[49]  R. Mises Grundlagen der Wahrscheinlichkeitsrechnung , 1919 .

[50]  B. Russell Mathematical Logic as Based on the Theory of Types , 1908 .