On the Entropy Rate of Hidden Markov Processes Observed Through Arbitrary Memoryless Channels

This paper studies the entropy rate of hidden Markov processes (HMPs) generated by observing a discrete-time binary homogeneous Markov chain through an arbitrary memoryless channel. A fixed-point functional equation is derived for the stationary distribution of an input symbol conditioned on all past observations. While the existence of a solution to the fixed-point functional equation is guaranteed by martingale theory, its uniqueness follows from the fact that the solution is the fixed point of a contraction mapping. The entropy or differential entropy rate of the HMP can then be obtained by computing the average entropy of each input symbol conditioned on past observations. In the absence of an analytical solution to the fixed-point functional equation, a numerical method is proposed in which the fixed-point functional equation is first converted to a discrete linear system using uniform quantization and then solved efficiently. The approximation error of the computed entropy rate is shown to be proportional to the quantization interval. Unlike many other numerical methods, this numerical solution is not based on averaging over a sample path of the HMP.
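The quantization idea described above can be illustrated with a minimal sketch (not the paper's exact algorithm, and specialized to a binary symmetric Markov source observed through a binary symmetric channel): the belief P(X_t = 1 | past observations) is discretized onto a uniform grid, the prediction filter induces a stochastic matrix on the grid, and the entropy rate is the conditional entropy averaged under that matrix's stationary distribution. All names, the grid size `N`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def h2(q):
    """Binary entropy in bits, clipped so that h2(0) = h2(1) = 0."""
    q = np.clip(q, 1e-12, 1 - 1e-12)
    return -(q * np.log2(q) + (1 - q) * np.log2(1 - q))

def entropy_rate(p, eps, N=1000, iters=500):
    """Approximate entropy rate of a binary symmetric Markov chain
    (flip probability p) observed through a BSC (crossover eps)."""
    b = (np.arange(N) + 0.5) / N          # belief grid: P(X_t = 1 | Y^{t-1})
    py1 = b * (1 - eps) + (1 - b) * eps   # P(Y_t = 1 | belief)
    T = np.zeros((N, N))                  # quantized filter transition matrix
    for y, py in ((1, py1), (0, 1 - py1)):
        # Bayes update for observation y, then one-step Markov prediction.
        lik1 = (1 - eps) if y == 1 else eps
        lik0 = eps if y == 1 else (1 - eps)
        post = b * lik1 / (b * lik1 + (1 - b) * lik0)
        nxt = post * (1 - p) + (1 - post) * p
        j = np.minimum((nxt * N).astype(int), N - 1)  # quantize next belief
        for i in range(N):
            T[i, j[i]] += py[i]
    pi = np.full(N, 1.0 / N)              # stationary law of the belief
    for _ in range(iters):
        pi = pi @ T
    return float(pi @ h2(py1))            # E[ H(Y_t | Y^{t-1}) ]
```

As sanity checks under this toy setup: with a noiseless channel (eps = 0) the output is the chain itself and the computed rate is close to h2(p), while with eps = 0.5 the output is i.i.d. uniform and the rate is 1 bit.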
