There is no Universal Source Code for an Infinite Alphabet

The vast majority of results in information theory concern situations where the actual probability law is known. When applying information theory to real-life problems, an obvious question is whether, as far as information theory is concerned, the probability law can be learned from the data. In noiseless source coding, for example, if the source alphabet is finite then the answer is yes, since good universal source coding procedures exist (see, e.g., [1], [2]). This paper concerns coding for a discrete infinite source alphabet, and shows that there is no universal source code over the class of discrete memoryless sources with an infinite source alphabet and finite entropy.

Let X be a random variable taking values in X = {1, 2, 3, ...} with distribution p and entropy H(X) < ∞, and consider a discrete memoryless source {X_i} with marginal distribution p. For such a source, let f_n be a variable-length uniquely decodable code with source block length n, and let ℓ_n denote the average codeword length of f_n. The redundancy per letter of f_n is defined by R_n = ℓ_n/n − H(X).
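To make these definitions concrete, here is a small numerical sketch (not from the paper): for block length n = 1 it computes H(X) and the per-letter redundancy R_1 = ℓ_1 − H(X), assuming a geometric source distribution on {1, 2, 3, ...} and Elias gamma codeword lengths; the parameter q and the truncation point N are illustrative choices.

```python
import math

# Illustrative sketch: per-letter redundancy R_n = l_n/n - H(X) for n = 1,
# for a geometric source on the infinite alphabet {1, 2, 3, ...} coded
# with Elias gamma codeword lengths. q and N are assumed for illustration.

q = 0.5  # assumed source parameter; p(i) = (1 - q) * q**(i - 1)

def p(i):
    """Geometric pmf on {1, 2, 3, ...}; has finite entropy H(X)."""
    return (1 - q) * q ** (i - 1)

def gamma_len(i):
    """Elias gamma codeword length: 2*floor(log2 i) + 1 bits."""
    return 2 * (i.bit_length() - 1) + 1

# Truncate the infinite sums; for q = 0.5 the tail beyond N = 200 is ~1e-60.
N = 200
H  = -sum(p(i) * math.log2(p(i)) for i in range(1, N + 1))  # entropy H(X)
l1 =  sum(p(i) * gamma_len(i)   for i in range(1, N + 1))   # average length

print(f"H(X) = {H:.4f} bits")
print(f"l_1  = {l1:.4f} bits")
print(f"R_1  = {l1 - H:.4f} bits per letter")
```

For this one fixed distribution the redundancy is a finite constant; the point of the paper is that no single code sequence {f_n} can drive R_n to zero simultaneously for every finite-entropy distribution on the infinite alphabet.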

[1] J. Ziv and A. Lempel, "Compression of individual sequences via variable-rate coding," IEEE Trans. Inf. Theory, vol. 24, no. 5, pp. 530–536, 1978.

[2] L. D. Davisson, "Universal noiseless coding," IEEE Trans. Inf. Theory, vol. 19, no. 6, pp. 783–795, 1973.