Some Convex Functions Based Measures of Independence and Their Application to Strange Attractor Reconstruction

Classical information-theoretic measures such as entropy and mutual information (MI) are widely applied across science and engineering. Csiszár generalized entropy and MI by means of convex functions. Recently, we proposed the grid occupancy (GO) and the quasientropy (QE) as measures of independence; QE explicitly includes a convex function in its definition, while the expectation of GO is a subclass of QE. In this paper, we study the effect of different convex functions on GO, QE, and Csiszár's generalized mutual information (GMI). A quality factor (QF) is proposed to quantify the sharpness of their minima; using the QF, we show that these measures can have sharper minima than the classical MI. We also propose a recursive algorithm for computing GMI that generalizes Fraser and Swinney's algorithm for computing MI. Finally, we apply GO, QE, and GMI to chaotic time series analysis and show that these measures are good criteria for determining the optimum delay in strange attractor reconstruction.
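To make the two quantities named in the abstract concrete, the sketch below is a minimal, hedged illustration (not the authors' implementation): it estimates a Csiszár-type generalized mutual information GMI_f(X;Y) = Σ_{i,j} p_i q_j f(r_ij / (p_i q_j)) from a 2-D histogram, where r is the joint distribution, p and q its marginals, and f(u) = u log u recovers the classical MI; it then scans the delay τ for the first local minimum of the measure, the Fraser–Swinney criterion for choosing the embedding delay. The function names, bin count, and the test signal are illustrative assumptions.

```python
import numpy as np

def gmi_f(x, y, f, bins=32):
    """Histogram estimate of a Csiszar-type generalized mutual information:
    GMI_f(X;Y) = sum_{i,j} p_i q_j f( r_ij / (p_i q_j) ),
    where r is the joint distribution and p, q its marginals.
    With f(u) = u*log(u) this reduces to the classical MI (in nats)."""
    r, _, _ = np.histogram2d(x, y, bins=bins)
    r = r / r.sum()                      # joint probabilities
    p = r.sum(axis=1)                    # marginal of X
    q = r.sum(axis=0)                    # marginal of Y
    pq = np.outer(p, q)
    mask = (pq > 0) & (r > 0)            # skip empty cells
    return float(np.sum(pq[mask] * f(r[mask] / pq[mask])))

def first_minimum_delay(x, f, max_lag=100, bins=32):
    """Smallest delay tau at which GMI_f(x(t), x(t+tau)) has a local
    minimum -- the Fraser-Swinney style criterion for the embedding
    delay in strange attractor reconstruction."""
    vals = [gmi_f(x[:-tau], x[tau:], f, bins) for tau in range(1, max_lag + 1)]
    for i in range(1, len(vals) - 1):
        if vals[i] < vals[i - 1] and vals[i] < vals[i + 1]:
            return i + 1                 # lags start at tau = 1
    return max_lag

if __name__ == "__main__":
    # Illustrative test signal (a noisy oscillation, not the paper's data).
    t = np.arange(20000) * 0.01
    x = np.sin(t) + 0.3 * np.sin(2.7 * t) + 0.05 * np.random.randn(t.size)
    f_mi = lambda u: u * np.log(u)       # classical MI as a special case
    print("first MI minimum at tau =", first_minimum_delay(x, f_mi))
```

Swapping `f_mi` for another convex function with f(1) = 0 yields the other members of the GMI family the paper compares; the delay-selection loop is unchanged.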
