Permutation Complexity and Coupling Measures in Hidden Markov Models

Recently, the authors proposed the duality between values (words) and orderings (permutations) as a basis for discussing the relationship between information-theoretic measures for finite-alphabet stationary stochastic processes and their permutation analogues. This duality has been used to give a simple proof of the equality between the entropy rate and the permutation entropy rate for any finite-alphabet stationary stochastic process, and to establish results on the excess entropy and the transfer entropy for finite-alphabet stationary ergodic Markov processes. In this paper, we extend these results to hidden Markov models and prove equalities between various information-theoretic complexity and coupling measures and their permutation analogues. In particular, we show two results for hidden Markov models with ergodic internal processes: first, the two permutation analogues of the transfer entropy, the symbolic transfer entropy and the transfer entropy on rank vectors, are both equivalent to the transfer entropy when considered as rates; second, directed information theory can be captured by the permutation entropy approach.
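To illustrate the permutation approach underlying these measures, the following is a minimal sketch of a Bandt–Pompe permutation entropy estimator for a finite sample of a process; the function name and parameters are ours for illustration, not taken from the paper.

```python
from collections import Counter
from math import log2

def permutation_entropy(series, order=3):
    """Estimate the permutation entropy (in bits) of a 1-D series
    from the empirical distribution of ordinal patterns of length `order`."""
    patterns = Counter()
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        # The ordinal pattern is the permutation that sorts the window.
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    return -sum((c / total) * log2(c / total) for c in patterns.values())
```

A strictly monotone series exhibits a single ordinal pattern and hence has zero permutation entropy, while an irregular series spreads mass over many patterns; the paper's results concern the rate of such quantities as the pattern length grows.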
