KOLMOGOROV'S CONTRIBUTIONS TO INFORMATION THEORY AND ALGORITHMIC COMPLEXITY
[1] Jean-Luc Ville. Étude critique de la notion de collectif, 1939.
[2] C. E. Shannon. A mathematical theory of communication, 1948, Bell Syst. Tech. J.
[3] The Philosophers' Stone, 1957, Nature.
[4] Richard von Mises, et al. Mathematical Theory of Probability and Statistics, 1966.
[5] Ray J. Solomonoff. A Formal Theory of Inductive Inference. Part II, 1964, Inf. Control.
[6] Per Martin-Löf. The Definition of Random Sequences, 1966, Inf. Control.
[7] Gregory J. Chaitin. On the Length of Programs for Computing Finite Binary Sequences, 1966, JACM.
[8] S. Kotz. Recent results in information theory, 1966.
[9] I. Good, et al. Mathematical Theory of Probability and Statistics, 1966.
[10] Robert B. Ash. Information Theory, 1965.
[11] Andrei N. Kolmogorov. Logical basis for information theory and probability theory, 1968, IEEE Trans. Inf. Theory.
[12] Gregory J. Chaitin. On the Length of Programs for Computing Finite Binary Sequences: Statistical Considerations, 1969, JACM.
[13] D. Ornstein. Bernoulli shifts with the same entropy are isomorphic, 1970.
[14] L. Levin, et al. The Complexity of Finite Objects and the Development of the Concepts of Information and Randomness by Means of the Theory of Algorithms, 1970.
[15] Claus-Peter Schnorr. Process complexity and effective random tests, 1973.
[16] Gregory J. Chaitin. Information-Theoretic Limitations of Formal Systems, 1974, JACM.
[17] G. Chaitin. Randomness and Mathematical Proof, 1975.
[18] G. Chaitin. A Theory of Program Size Formally Identical to Information Theory, 1975, JACM.
[19] J. Aczél. Entropy and ergodic theory, 1975.
[20] Gregory J. Chaitin. Algorithmic Information Theory, 1977, IBM J. Res. Dev.
[21] Ray J. Solomonoff. Complexity-based induction systems: Comparisons and convergence theorems, 1978, IEEE Trans. Inf. Theory.
[22] Simon Haykin. Communication Systems, 1978.
[23] R. Gray, et al. Block Synchronization, Sliding-Block Coding, Invulnerable Sources and Zero Error Codes for Discrete Noisy Channels, 1980.
[24] Wolfgang J. Paul, et al. An Information-Theoretic Approach to Time Bounds for On-Line Computation, 1981, J. Comput. Syst. Sci.
[25] B. Gnedenko, et al. Andrei Nikolaevich Kolmogorov (on his eightieth birthday), 1983.
[26] Péter Gács. On the relation between descriptional complexity and algorithmic probability, 1981, 22nd Annual Symposium on Foundations of Computer Science (SFCS 1981).
[27] Leonid A. Levin. Randomness Conservation Inequalities; Information and Independence in Mathematical Theories, 1984, Inf. Control.
[28] A. Barron. The Strong Ergodic Theorem for Densities: Generalized Shannon–McMillan–Breiman Theorem, 1985.
[29] W. Maass. Combinatorial lower bound arguments for deterministic and nondeterministic Turing machines, 1985.
[30] V. V'yugin. On the Defect of Randomness of a Finite Object with Respect to Measures with Given Complexity Bounds, 1988.
[31] Ming Li, et al. Tape versus Queue and Stacks: The Lower Bounds, 1988, Inf. Comput.
[32] Ming Li, et al. Kolmogorov Complexity and its Applications, 1991, Handbook of Theoretical Computer Science, Volume A: Algorithms and Complexity.