A Method for Estimating the Entropy of Time Series Using Artificial Neural Networks

Measuring the predictability and complexity of time series using entropy is an essential tool for designing and controlling nonlinear systems. However, existing methods have drawbacks related to the strong dependence of the entropy estimate on the parameters of the method. To overcome these difficulties, this study proposes a new method for estimating the entropy of a time series using the LogNNet neural network model. The LogNNet reservoir matrix is filled with time series elements according to our algorithm. The classification accuracy on images from the MNIST-10 database is taken as the entropy measure and denoted NNetEn. The novelty of the entropy calculation is that the time series participates in mixing the input information in the reservoir. Greater complexity in the time series leads to higher classification accuracy and higher NNetEn values. We introduce a new time series characteristic, called time series learning inertia, that determines the learning rate of the neural network. The robustness and efficiency of the method are verified on chaotic, periodic, random, binary, and constant time series. A comparison of NNetEn with other entropy estimation methods demonstrates that our method is more robust and accurate and can be widely used in practice.
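To make the idea concrete, the following is a minimal sketch of the NNetEn pipeline, not the published LogNNet/NNetEn implementation: a reservoir matrix is filled with (cyclically repeated) time series elements, input images are projected through it, a simple linear output layer is trained, and the resulting test accuracy serves as the entropy estimate. The filling order, normalization, classifier, and the use of scikit-learn's small digits set as a stand-in for MNIST-10 are all assumptions made for illustration.

```python
import numpy as np
from sklearn.datasets import load_digits          # small stand-in for MNIST-10
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def nneten_sketch(series, n_reservoir=25, seed=0):
    """Illustrative NNetEn-style estimate: fill a reservoir matrix with
    time series elements, project images through it, train a linear
    output layer, and return the test accuracy as the entropy proxy.
    The real LogNNet/NNetEn algorithm differs in the filling scheme,
    normalization, and training procedure."""
    X, y = load_digits(return_X_y=True)           # 8x8 digit images
    X = X / 16.0                                  # scale pixels to [0, 1]
    n_features = X.shape[1]

    # Fill the reservoir row by row with cyclically repeated series
    # elements (the filling scheme here is an assumption).
    series = np.asarray(series, dtype=float)
    needed = n_reservoir * n_features
    reps = int(np.ceil(needed / series.size))
    W = np.tile(series, reps)[:needed].reshape(n_reservoir, n_features)

    # Project the images into the reservoir space and normalize.
    H = X @ W.T
    H = (H - H.mean(axis=0)) / (H.std(axis=0) + 1e-12)

    # Train the output layer; test accuracy plays the role of NNetEn.
    Xtr, Xte, ytr, yte = train_test_split(H, y, test_size=0.3, random_state=seed)
    clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
    return clf.score(Xte, yte)

# A complex (random) series should yield a higher score than a constant one,
# mirroring the claim that greater complexity gives higher NNetEn values.
rng = np.random.default_rng(0)
print("random series :", nneten_sketch(rng.random(1000)))
print("constant series:", nneten_sketch(np.ones(1000)))
```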
