Kolmogorov Complexity, Data Compression, and Inference
If a sequence of random variables has Shannon entropy H, it is well known that there exists an efficient description of this sequence that requires only H bits. But the entropy H of a sequence is also tied to inference: low-entropy sequences allow good guesses of their next terms. This is best illustrated by allowing a gambler to gamble at fair odds on such a sequence. The logarithm of the amount of money that one can win is essentially the complement of the entropy with respect to the length of the sequence, i.e., about n − H for a sequence of length n.
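As a concrete illustration (a minimal sketch, not taken from the paper), the Python snippet below simulates a Bernoulli(p) source and a proportional (Kelly) bettor at fair 2-for-1 odds; the function names binary_entropy and kelly_log_wealth, the choice p = 0.9, and the proportional betting rule are assumptions made here for illustration. The gambler's log-wealth after n bets concentrates near n(1 − H(p)), the complement of the entropy described above.

import math
import random

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a Bernoulli(p) source, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def kelly_log_wealth(sequence, p: float) -> float:
    """log2 of final wealth when betting a fraction p of capital on '1'
    (and 1 - p on '0') at fair 2-for-1 odds on each symbol."""
    log_wealth = 0.0
    for bit in sequence:
        # Fair odds double the portion of capital placed on the outcome
        # that actually occurs; the rest is lost.
        log_wealth += math.log2(2 * (p if bit == 1 else 1 - p))
    return log_wealth

random.seed(0)
n, p = 100_000, 0.9  # low-entropy source: H(0.9) is about 0.469 bits/symbol
seq = [1 if random.random() < p else 0 for _ in range(n)]

print(f"n - nH(p)    = {n * (1 - binary_entropy(p)):.0f} bits")
print(f"log2(wealth) = {kelly_log_wealth(seq, p):.0f} bits")

The two printed quantities agree to within sampling fluctuations, matching the claim that the achievable winnings (in doublings) are essentially n − H.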