On a simple derivation of a family of nonextensive entropies from information content

Abstract: The nonextensive entropy of Tsallis can be seen as a consequence of postulates on self-information, namely the constant ratio of the first derivative of the self-information per unit probability to its curvature (second variation). This constancy holds if we regard the probability distribution as the gradient of the self-information. By considering the form of the $n$th derivative of the self-information while keeping this ratio constant, we arrive at a general class of nonextensive entropies. Some properties of the series of entropies constructed in this picture are investigated.

Keywords: Information theory; Nonadditive entropy; Information content.

1 Introduction

Tsallis statistics [1] can be seen as a formalism based on a pair of deformations of the usual exponential and logarithmic functions [2, 3]. These deformed functions are called the $q$-exponential and the $q$-logarithmic functions, respectively. The $q$-exponential function is defined as $e_q^x \equiv [1+(1-q)x]_+^{1/(1-q)}$, where $[x]_+ = \max\{x, 0\}$.
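The pair of deformed functions above is easy to explore numerically. The following is a minimal sketch (function names are ours, not from the paper) of the $q$-exponential as defined above, together with the standard $q$-logarithm $\ln_q x = (x^{1-q}-1)/(1-q)$ that inverts it on its positive range; both reduce to the ordinary exponential and logarithm as $q \to 1$.

```python
import math

def q_exponential(x, q):
    """e_q^x = [1 + (1-q)x]_+^(1/(1-q)); reduces to exp(x) at q = 1."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    if base <= 0.0:
        return 0.0  # the cutoff [.]_+ = max{., 0}
    return base ** (1.0 / (1.0 - q))

def q_logarithm(x, q):
    """ln_q x = (x^(1-q) - 1)/(1-q) for x > 0; reduces to log(x) at q = 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)
```

A quick consistency check is that `q_logarithm(q_exponential(x, q), q)` returns `x` whenever the cutoff is not triggered, mirroring the inverse relation between the two deformed functions.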

[1] S. Abe et al., Statistical mechanical foundations of power-law distributions, 2004.

[2] C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, 1988.

[3] F. Sattin, Derivation of Tsallis statistics from dynamical equations for a granular gas, 2003.

[4] C. Tsallis, What are the numbers that experiments provide?, 1994.

[5] R. V. L. Hartley, Transmission of information, 1928.

[6] R. D. Rossignoli et al., Non-additive entropies and quantum statistics, 1999.

[7] S. Watanabe et al., Knowing and guessing: a quantitative study of inference and information, 1969.

[8] P. Landsberg et al., Distributions and channel capacities in generalized statistical mechanics, 1998.

[9] J. Naudts, Deformed exponentials and logarithms in generalized thermostatistics, 2002, cond-mat/0203489.

[10] D. E. Boekee et al., A generalized class of certainty and information measures, 1982, Inf. Sci.

[11] Z. Daróczy, Generalized information functions, 1970, Inf. Control.

[12] C. Anteneodo et al., Nonextensive statistical mechanics and economics, 2003, ArXiv.

[13] C. Beck, Dynamical foundations of nonextensive statistical mechanics, 2001.

[14] C. Beck et al., Thermodynamics of chaotic systems, 1993.

[15] C. H. Chen et al., On information and distance measures, error bounds, and feature selection, 1976, Information Sciences.

[16] Funabashi et al., Implications of form invariance to the structure of nonextensive entropies, 1999, quant-ph/9904029.

[17] T. Yamano, Information theory based on nonadditive information content, 2000, Physical Review E.

[18] C. Anteneodo et al., Maximum entropy approach to stretched exponential probability distributions, 1999.

[19] S. Watanabe, Knowing and guessing, 1969.

[20] D. Vernon, Inform, 1995, Encyclopedia of the UN Sustainable Development Goals.

[21] J. C. van der Lubbe, Information theory, 1997.

[22] G. G. Stokes, "J.", 1890, The New Yale Book of Quotations.

[23] A. R. R. Papa, On one-parameter-dependent generalizations of Boltzmann-Gibbs statistical mechanics, 1998.