Kolmogorov Complexity, Data Compression, and Inference

If a sequence of random variables has Shannon entropy H, it is well known that the sequence admits an efficient description requiring, on average, only H bits. But the entropy H of a sequence also bears on inference: low-entropy sequences permit good guesses of their next terms. This is best illustrated by letting a gambler bet at fair odds on successive terms of such a sequence. The logarithm of the wealth one can accumulate grows essentially as the complement of the entropy with respect to the length of the sequence, that is, as n - H.
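To see why the attainable wealth is the complement of the entropy, consider i.i.d. Bernoulli(p) bits with fair 2-for-1 odds on each bit. A gambler who repeatedly bets the fraction p of current wealth on the next bit being 1 (proportional, or Kelly, betting) has wealth S_n = 2^n p(x_1, ..., x_n) after n bets, so by the asymptotic equipartition property (1/n) log2 S_n converges to 1 - H(p). The sketch below is a minimal numerical check of this claim; the function names and the parameter choices p = 0.9, n = 100000 are illustrative, not part of the text.

```python
import math
import random

def entropy(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def kelly_log_wealth(p, n, seed=0):
    """Log2-wealth of a proportional (Kelly) bettor on n i.i.d.
    Bernoulli(p) bits at fair 2-for-1 odds, starting from wealth 1.
    Betting the fraction p of wealth on '1' multiplies wealth by
    2p when a 1 occurs and by 2(1-p) when a 0 occurs."""
    rng = random.Random(seed)
    log_wealth = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        log_wealth += math.log2(2 * p if x == 1 else 2 * (1 - p))
    return log_wealth

if __name__ == "__main__":
    p, n = 0.9, 100_000
    growth = kelly_log_wealth(p, n) / n
    # The per-bet doubling rate should approach 1 - H(p): the
    # complement of the entropy with respect to the sequence length.
    print(f"empirical growth rate: {growth:.4f}")
    print(f"1 - H(p):              {1 - entropy(p):.4f}")
```

Betting the fraction p is the growth-optimal strategy in this setting; any other fraction lowers the exponent, and at p = 1/2 the growth rate 1 - H(1/2) is zero, so a fair coin offers the gambler no edge, in keeping with the claim that only low-entropy sequences can be exploited.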