Derivation of an amplitude of information in the setting of a new family of fractional entropies

By generalizing the basic functional equation f(xy) = f(x) + f(y) to the form f^β(xy) = f^β(x) + f^β(y), β > 1, one can derive a family of solutions which are exactly the inverse of the Mittag-Leffler function, referred to as the Mittag-Leffler logarithm, or logarithm of fractional order. This result provides a new family of generalized informational entropies indexed by a parameter clearly related to fractals, via fractional calculus, and which is quite relevant in the presence of a defect of observation. The relation with Shannon's entropy, Rényi's entropy and Tsallis' entropy is clarified, and it is shown that Tsallis' generalized logarithm has a significance in terms of fractional calculus. The case β = 2 appears directly relevant to the amplitude of probability in quantum mechanics, and provides an approach to the definition of an "amplitude of informational entropy". One examines the kind of result obtained by applying the maximum entropy principle in this setting. In the presence of an uncertain (or fuzzy) definition, the Mittag-Leffler function would be more relevant than the Gaussian normal law. To some extent, this new formulation can be fully supported by the derivation of a new family of fractional Fisher information.
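As an illustration only, since the abstract does not spell out the paper's exact definitions, the following minimal Python sketch assumes the Mittag-Leffler logarithm Ln_β is the functional inverse of the Mittag-Leffler function E_β(x) = Σ_k x^k / Γ(βk + 1), and takes H_β(p) = Σ_i p_i Ln_β(1/p_i) as a candidate fractional entropy that reduces to Shannon's entropy (in nats) at β = 1. The names mittag_leffler, ml_log and fractional_entropy are hypothetical, not taken from the paper.

```python
import math
from scipy.optimize import brentq

def mittag_leffler(x, beta, tol=1e-16, max_terms=80):
    """Truncated series E_beta(x) = sum_{k>=0} x^k / Gamma(beta*k + 1)."""
    total = 0.0
    for k in range(max_terms):
        term = x**k / math.gamma(beta * k + 1)
        total += term
        if abs(term) < tol:
            break
    return total

def ml_log(y, beta):
    """Mittag-Leffler logarithm: the x >= 0 with E_beta(x) = y, for y >= 1.

    E_beta is increasing on [0, inf) with E_beta(0) = 1, so the root is
    bracketed by doubling the upper bound and then solved with brentq.
    """
    if y < 1.0:
        raise ValueError("this sketch only inverts E_beta on [1, inf)")
    hi = 1.0
    while mittag_leffler(hi, beta) < y:
        hi *= 2.0
    return brentq(lambda x: mittag_leffler(x, beta) - y, 0.0, hi)

def fractional_entropy(p, beta):
    """Candidate H_beta(p) = sum_i p_i * Ln_beta(1/p_i); assumed form, not the paper's exact definition.

    At beta = 1, E_1(x) = exp(x), so Ln_1 = ln and H_1 is Shannon's entropy in nats.
    """
    return sum(pi * ml_log(1.0 / pi, beta) for pi in p if pi > 0)

# Small demo: Shannon case (beta = 1) versus the beta = 2 "amplitude" case.
p = [0.5, 0.25, 0.25]
print(fractional_entropy(p, 1.0))  # ~1.0397 nats, matches Shannon's entropy
print(fractional_entropy(p, 2.0))  # beta = 2, the case linked to probability amplitudes
```

For β = 2 the series inverted here is E_2(x) = cosh(√x), which is what makes this case resemble an amplitude-like quantity as discussed in the abstract; for other β the inversion is purely numerical in this sketch.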
