Aspects of Time Series Analysis with Entropies and Complexity Measures

This exploratory work examines time series complexity, with direct application to detecting changes in the structure and properties of signals. The objective of the paper is to compare evaluations of the complexity of time series data obtained with entropy-based measures and with data complexity indices. As test signals, deterministic, random, and chaotic signals are considered, both independently and in probabilistic mixtures. The complexity descriptors are based on entropies (Rényi, Tsallis, and the multiscale technique) and on two data complexity indices (Lempel-Ziv complexity and the Lyapunov exponent). High values of the complexity measures are expected in all cases where the random or chaotic components are dominant, i.e., have greater amplitudes than the deterministic components. The complexity measures are evaluated in terms of monotonicity, sensitivity to the length of the time series, and ability to detect changes in the structure of the analyzed signal. The results of the computer-based simulation experiments are presented with fuzzy labels and vary in quality, i.e., good for some cases and poor for others. These results suggest an aggregate criterion for change detection with at least two terms, one based on entropy and another on complexity indices.
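To make the measures named above concrete, the following is a minimal Python sketch of two of the entropy descriptors (Rényi and Tsallis, computed from a discrete probability distribution) and one of the data complexity indices (a simple Lempel-Ziv parsing of a symbol string). This is an illustrative implementation under standard textbook definitions, not the paper's own code; the multiscale technique and the Lyapunov exponent are omitted, and the function names are the author's choice.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha for a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if alpha == 1.0:
        return -np.sum(p * np.log(p))  # Shannon entropy as the alpha -> 1 limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q for a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if q == 1.0:
        return -np.sum(p * np.log(p))  # reduces to Shannon entropy at q = 1
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def lempel_ziv_complexity(s):
    """Count of distinct phrases in a greedy left-to-right parsing of s."""
    phrases = set()
    ind, inc = 0, 1
    while ind + inc <= len(s):
        sub = s[ind:ind + inc]
        if sub in phrases:
            inc += 1          # phrase already seen: extend it
        else:
            phrases.add(sub)  # new phrase: record it and restart
            ind += inc
            inc = 1
    return len(phrases)

# A constant (deterministic) sequence parses into fewer phrases than an
# irregular one, matching the expectation that random/chaotic-dominated
# signals score higher on complexity measures.
print(lempel_ziv_complexity("00000000"), lempel_ziv_complexity("01101000"))
```

For a uniform two-symbol distribution, `renyi_entropy([0.5, 0.5], 2.0)` equals `log 2` and `tsallis_entropy([0.5, 0.5], 2.0)` equals `0.5`, which can serve as quick sanity checks.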
