On Hidden Markov Processes with Infinite Excess Entropy

We investigate stationary hidden Markov processes for which the mutual information between the past and the future, known as the excess entropy, is infinite. We assume that the number of observable states is finite and the number of hidden states is countably infinite. Under this assumption, we show that the block mutual information of a hidden Markov process is upper bounded by a power law determined by the tail index of the hidden state distribution. Moreover, we exhibit three examples of such processes. The first example, considered previously, is nonergodic, and its block mutual information is bounded by the logarithm of the block length. The second example is also nonergodic, but its block mutual information obeys a power law. The third example is ergodic, and its block mutual information obeys a power law as well.
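To fix notation, here is a minimal LaTeX sketch of the quantities the abstract refers to. The symbols X_i (observable symbols), Y_i (hidden states), and the exponent beta are our own conventions for illustration, not taken from the paper, and the displayed bound is only a schematic reading of the abstract's claim:

% Block mutual information between two adjacent blocks of length n,
% and the excess entropy as its limit (equivalently, the mutual
% information between the infinite past and the infinite future):
\[
  E(n) := I\bigl(X_1^n \,;\, X_{n+1}^{2n}\bigr),
  \qquad
  E := \lim_{n \to \infty} E(n) = I\bigl(X_{-\infty}^{0} \,;\, X_1^{\infty}\bigr).
\]
% Schematic form of the upper bound stated in the abstract: if the
% stationary hidden-state distribution P(Y_1 = y) has a power-law tail,
% then for some exponent beta in (0,1) determined by the tail index,
\[
  E(n) = O\bigl(n^{\beta}\bigr),
\]
% so each E(n) is finite, yet the excess entropy E = lim E(n) may
% still diverge, which is the regime studied in the paper.

Under this reading, the three examples differ in the growth rate of E(n): logarithmic in the first, power-law in the second and third.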
