From Finite-System Entropy to Entropy Rate for a Hidden Markov Process

A recent result presented the entropy rate of a hidden Markov process (HMP) as a power series in the noise parameter ε. The coefficients of the expansion around the noiseless (ε = 0) limit were calculated up to 11th order, using a conjecture that relates the entropy rate of an HMP to the entropy of a finite-length process, which can be calculated analytically. In this letter, we generalize and prove the conjecture and discuss its theoretical and practical consequences.
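As a concrete illustration of the finite-length quantity involved, the sketch below is a minimal numerical example and not the paper's construction: it computes the block entropy H(Y_1, ..., Y_n) of a binary symmetric Markov chain with flip probability p, observed through a binary symmetric channel with crossover probability ε, and uses the conditional block entropy H_n - H_{n-1} as a finite-n estimate of the entropy rate. The model choice, the parameter names p and eps, and the function hmp_block_entropy are illustrative assumptions, not taken from the paper.

    import itertools
    import numpy as np

    def hmp_block_entropy(n, p, eps):
        """Block entropy H(Y_1..Y_n) in bits for a binary symmetric Markov
        chain (flip probability p) observed through a binary symmetric
        channel (crossover probability eps). Illustrative model only."""
        P = np.array([[1 - p, p], [p, 1 - p]])          # state transition matrix
        E = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission: E[x, y] = P(y | x)
        pi = np.array([0.5, 0.5])                       # stationary state distribution

        H = 0.0
        for y in itertools.product((0, 1), repeat=n):
            # forward recursion: alpha[x] = P(y_1..y_t, X_t = x)
            alpha = pi * E[:, y[0]]
            for t in range(1, n):
                alpha = (alpha @ P) * E[:, y[t]]
            prob = alpha.sum()
            if prob > 0.0:
                H -= prob * np.log2(prob)
        return H

    if __name__ == "__main__":
        p, eps = 0.3, 0.01
        for n in range(2, 8):
            # conditional block entropy H_n - H_{n-1} as a finite-n entropy-rate estimate
            rate = hmp_block_entropy(n, p, eps) - hmp_block_entropy(n - 1, p, eps)
            print(f"n = {n}: H_n - H_(n-1) = {rate:.6f} bits")

The brute-force enumeration over all 2^n observation sequences is only meant to make the finite-length entropy explicit; it is exponential in n and is practical only for small block lengths.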
