Relative Consistency of Sample Entropy Is Not Preserved in MIX Processes

Relative consistency is a property attributed to entropic parameters, most notably Approximate Entropy and Sample Entropy. It is routinely assumed in, e.g., biomedical and economic time-series analysis, because it permits comparing different time series at a single value of the threshold parameter r: if one series yields lower entropy than another at one value of r, the same ordering is expected to hold at every other value. Although there is no formal proof of this property, it is generally accepted as true. Relative consistency of both Approximate Entropy and Sample Entropy was first tested with the MIX process; in their seminal paper, Richman and Moorman showed cases in which Approximate Entropy lacked the property while Sample Entropy retained it. In the present paper, we show that relative consistency is not preserved for MIX processes if enough noise is added, whereas it is preserved, no matter how much noise is present, for another process that we define as the sum of a sinusoidal and a stochastic component. The analysis presented in this paper is only feasible thanks to the very fast NCM algorithm for calculating correlation sums and hence Sample Entropy.
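To make the objects in the abstract concrete, the following is a minimal sketch (not the paper's NCM implementation) of Pincus's MIX(p) process, which with probability p replaces each sample of a fixed-amplitude sine with variance-matched uniform noise, together with a naive O(N²) Sample Entropy estimator. Parameter choices (m = 2, r = 0.2, series length, random seed) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed: illustrative choice

def mix(p, n=1000):
    """MIX(p) process (Pincus, 1991): each sample of a deterministic
    sine is replaced, with probability p, by uniform noise whose
    variance matches that of the sine (both equal to 1)."""
    j = np.arange(n)
    x = np.sqrt(2) * np.sin(2 * np.pi * j / 12)       # deterministic sine
    y = rng.uniform(-np.sqrt(3), np.sqrt(3), size=n)  # uniform noise, var 1
    z = rng.random(n) < p                             # Bernoulli(p) mask
    return np.where(z, y, x)

def sampen(u, m=2, r=0.2):
    """Naive Sample Entropy: -ln(A/B), where B and A count template
    pairs of length m and m+1 within Chebyshev distance r
    (self-matches excluded)."""
    n = len(u)
    def count(mm):
        templ = np.array([u[i:i + mm] for i in range(n - mm)])
        c = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += np.count_nonzero(d <= r)
        return c
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

Relative consistency can then be probed by evaluating two series across several thresholds, e.g. `[sampen(mix(0.1), r=r) < sampen(mix(0.9), r=r) for r in (0.1, 0.2, 0.3)]`, and checking whether the ordering ever flips as r varies; the paper's point is that for MIX processes with enough noise it does.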

[1] D. Ruelle et al., "Ergodic theory of chaos and strange attractors," 1985.

[2] A. L. Goldberger et al., "Physiological time-series analysis: what does regularity quantify?," The American Journal of Physiology, 1994.

[3] J. P. Crutchfield et al., "Geometry from a time series," 1980.

[4] P. Castiglioni et al., "How the threshold 'r' influences approximate entropy analysis of heart-rate variability," 2008 Computers in Cardiology, 2008.

[5] M. Costa et al., "Multiscale entropy analysis of biological signals," Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, 2005.

[6] S. Zurek et al., "On the relation between correlation dimension, approximate entropy and sample entropy parameters, and a fast algorithm for their calculation," 2012.

[7] N. H. Hunt et al., "The appropriate use of approximate entropy and sample entropy with short data sets," Annals of Biomedical Engineering, 2012.

[8] R. C. Hilborn, Chaos and Nonlinear Dynamics: An Introduction for Scientists and Engineers, 1994.

[9] J. McCamley et al., "Effect of parameter selection on entropy calculation for long walking trials," Gait & Posture, 2018.

[10] W. Denton et al., "Sampling frequency influences sample entropy of kinematics during walking," Medical & Biological Engineering & Computing, 2018.

[11] S. Pincus, "Approximate entropy (ApEn) as a complexity measure," Chaos, 1995.

[12] W. Yu et al., "Characterization of surface EMG signal based on fuzzy entropy," IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2007.

[13] S. Wei et al., "Determination of sample entropy and fuzzy measure entropy parameters for distinguishing congestive heart failure from normal sinus rhythm subjects," Entropy, 2015.

[14] S. M. Pincus, "Approximate entropy as a measure of system complexity," Proceedings of the National Academy of Sciences of the United States of America, 1991.

[15] F. Takens, "Detecting strange attractors in turbulence," 1981.

[16] J. S. Richman and J. R. Moorman, "Physiological time-series analysis using approximate entropy and sample entropy," American Journal of Physiology: Heart and Circulatory Physiology, 2000.

[17] D. Cuesta-Frau et al., "Comparative study of approximate entropy and sample entropy robustness to spikes," Artificial Intelligence in Medicine, 2011.