A generalized class of certainty and information measures

In contrast to the usual definition of information measures in terms of the relation between information and uncertainty, this paper follows a different approach, in which the relation between certainty and information plays the central role. Within this framework, information measures are introduced by means of certainty measures. This approach leads to three generalized classes of information measures and unifies the known measures of information into a single generalized probabilistic theory of discrete information measures.
