Probability distribution and entropy as a measure of uncertainty

The relationship between three probability distributions and their maximizable entropy forms is discussed without postulating any property of the entropy. For this purpose, the entropy I is defined as a measure of the uncertainty of the probability distribution of a random variable x through a variational relationship, a definition that underlies the maximization of entropy for the corresponding distribution.
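As an illustrative sketch of the maximization idea (not taken from the paper): maximizing the Shannon entropy I = -Σ p_i ln p_i under a normalization constraint and a fixed mean Σ p_i x_i = m yields the exponential form p_i = exp(-β x_i)/Z, where the Lagrange multiplier β is fixed by the mean constraint. The function names below (`shannon_entropy`, `max_entropy_dist`) are assumptions made for this example.

```python
import math

def shannon_entropy(p):
    """Shannon entropy I = -sum_i p_i ln p_i (with 0 ln 0 taken as 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0.0)

def max_entropy_dist(xs, m, lo=-50.0, hi=50.0, iters=200):
    """Return (beta, p) with p_i = exp(-beta * x_i) / Z matching mean m.

    The mean of the exponential-family distribution is strictly
    decreasing in beta, so bisection on [lo, hi] locates the
    multiplier for any target mean strictly between min(xs) and max(xs).
    """
    def mean_of(beta):
        ws = [math.exp(-beta * x) for x in xs]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, xs)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > m:
            lo = mid   # mean too large -> increase beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    ws = [math.exp(-beta * x) for x in xs]
    z = sum(ws)
    return beta, [w / z for w in ws]
```

With equally spaced states and the target mean at their midpoint, β tends to 0 and the maximizing distribution is uniform, recovering the familiar unconstrained limit where the entropy equals ln N.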
