Synchronizing to Periodicity: The Transient Information and Synchronization Time of Periodic Sequences

We analyze how difficult it is for an observer who knows the structure of a periodic sequence, but is initially unaware of its phase, to synchronize to that sequence. We examine the transient information T, a recently introduced information-theoretic quantity that measures the uncertainty an observer experiences while synchronizing to a sequence. We also consider the synchronization time τ, the average number of measurements required to infer the phase of a periodic signal. We calculate T and τ for all periodic sequences up to and including period 23. We determine which sequences of a given period attain the maximum and minimum possible values of T and τ, develop analytic expressions for these extreme values, and show that in these cases the transient information is the product of the total phase information and the synchronization time. Despite the latter result, our analyses demonstrate that the transient information and the synchronization time capture different and complementary structural properties of individual periodic sequences; these properties are, moreover, distinct from the source entropy rate and from mutual-information measures such as the excess entropy.
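
The quantities above can be made concrete with a short sketch. The following Python is not from the paper; it assumes the standard block-entropy setup for a period-p sequence observed with a uniform prior over its p phases: the total phase information (excess entropy) is E = log2 p, the entropy rate vanishes, the transient information is taken here as T = Σ_{L≥1} [E − H(L)] (conventions differ on whether an L = 0 term, equal to E, is also included), and τ is the average over phases of the smallest word length whose observation identifies the phase uniquely. All function names are mine.

```python
import math
from collections import Counter

def block_entropy(seq: str, L: int) -> float:
    """Shannon entropy H(L), in bits, of length-L words read off the cycle
    with a uniform prior over the p = len(seq) phases."""
    p = len(seq)
    words = Counter((seq * 2)[i:i + L] for i in range(p))  # wrap around the cycle
    return -sum((c / p) * math.log2(c / p) for c in words.values())

def sync_time(seq: str) -> float:
    """Average number of symbols an observer must see before the observed
    word occurs at exactly one phase. Assumes primitive period len(seq)."""
    p = len(seq)
    doubled = seq * 2
    assert len({doubled[i:i + p] for i in range(p)}) == p, \
        "sequence must have primitive period len(seq)"
    lengths = []
    for i in range(p):
        L = 1
        # grow the observed word until it matches at exactly one phase
        while sum(doubled[j:j + L] == doubled[i:i + L] for j in range(p)) > 1:
            L += 1
        lengths.append(L)
    return sum(lengths) / p

def transient_information(seq: str) -> float:
    """T = sum over L >= 1 of [log2(p) - H(L)]; terms vanish once every
    length-L word is unique to a single phase, which happens by L = p."""
    p = len(seq)
    E = math.log2(p)  # total phase information of a period-p sequence
    return sum(E - block_entropy(seq, L) for L in range(1, p + 1))

for s in ("0001", "0011", "0111"):
    print(s, round(sync_time(s), 3), round(transient_information(s), 3))
```

The doubling trick (seq * 2) lets cyclic words be read as ordinary string slices, and brute-force enumeration over phases is entirely adequate here, since the paper's exhaustive calculations only go up to period 23.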
