Analyticity of Entropy Rate of Hidden Markov Chains With Continuous Alphabet

We first prove that, under certain mild assumptions, the entropy rate of a hidden Markov chain, obtained by passing a finite-state stationary Markov chain through a discrete-time continuous-output channel, is analytic with respect to the parameters of the input Markov chain. We then prove, under strengthened assumptions on the channel, that the entropy rate is jointly analytic as a function of both the input Markov chain parameters and the channel parameters. In particular, the main theorems establish the analyticity of the entropy rate for two representative channels: 1) Cauchy and 2) Gaussian. The analyticity results obtained are expected to be helpful in the computation and estimation of the entropy rate of hidden Markov chains and of the capacity of finite-state channels with a continuous output alphabet.
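To make the object of study concrete, the following sketch estimates the (differential) entropy rate of a hidden Markov chain observed through a Gaussian channel, one of the paper's two representative channels. It uses the standard Shannon–McMillan–Breiman approach: simulate a long observation sequence and evaluate $-\frac{1}{n}\log f(y_1,\dots,y_n)$ with a normalized forward recursion. All specific values (the transition matrix, state means, noise level) are illustrative assumptions, not from the paper.

```python
import numpy as np

def gaussian_pdf(y, mean, sigma):
    # Density of N(mean, sigma^2) evaluated at y (broadcasts over `mean`).
    return np.exp(-(y - mean) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

def estimate_entropy_rate(P, means, sigma, n=50_000, seed=0):
    """Monte Carlo estimate (in nats) of the differential entropy rate of
    Y_t = mu(X_t) + N(0, sigma^2), where X_t is a stationary Markov chain
    with transition matrix P and state means `means`."""
    rng = np.random.default_rng(seed)
    k = P.shape[0]

    # Stationary distribution: left eigenvector of P for eigenvalue 1.
    evals, evecs = np.linalg.eig(P.T)
    pi = np.real(evecs[:, np.argmin(np.abs(evals - 1))])
    pi /= pi.sum()

    # Simulate the hidden chain and the noisy observations.
    x = np.empty(n, dtype=int)
    x[0] = rng.choice(k, p=pi)
    for t in range(1, n):
        x[t] = rng.choice(k, p=P[x[t - 1]])
    y = means[x] + sigma * rng.standard_normal(n)

    # Forward recursion on densities, normalizing each step; the running
    # sum of log-normalizers equals log f(y_1, ..., y_n).
    alpha = pi * gaussian_pdf(y[0], means, sigma)
    log_f = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, n):
        alpha = (alpha @ P) * gaussian_pdf(y[t], means, sigma)
        c = alpha.sum()
        log_f += np.log(c)
        alpha /= c

    return -log_f / n  # -> h(Y) as n grows, a.s.

# Illustrative two-state example (values are assumptions for the sketch).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
h = estimate_entropy_rate(P, means=np.array([0.0, 1.0]), sigma=0.5)
```

The analyticity results in the paper imply that the quantity estimated here varies analytically in the entries of `P` (and, for the Gaussian channel, in `sigma`), which justifies, e.g., series expansions and gradient-based capacity computations built on such estimates.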
