Axiomatic Characterizations of Information Measures
[1] J. von Neumann. Thermodynamik quantenmechanischer Gesamtheiten , 1927 .
[2] A. Bhattacharyya. On a measure of divergence between two statistical populations defined by their probability distributions , 1943 .
[3] S. Kullback and R. A. Leibler. On Information and Sufficiency , 1951 .
[4] E. Jaynes. Information Theory and Statistical Mechanics , 1957 .
[5] H. Tverberg. A New Derivation of the Information Function. , 1958 .
[6] I. N. Sanov. On the probability of large deviations of random variables , 1958 .
[7] S. Kullback,et al. Information Theory and Statistics , 1959 .
[8] Solomon Kullback,et al. Information Theory and Statistics , 1960 .
[9] T. Chaundy,et al. On a Functional Equation , 1960 .
[10] A. Rényi. On Measures of Entropy and Information , 1961 .
[11] R. Ingarden,et al. Information without probability , 1962 .
[12] Z. Daróczy. Über die gemeinsame Charakterisierung der zu den nicht vollständigen Verteilungen gehörigen Entropien von Shannon und von Rényi , 1963 .
[13] Z. Daróczy. Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen , 1964 .
[14] P. M. Lee. On the Axioms of Information Theory , 1964 .
[15] L. L. Campbell,et al. A Coding Theorem and Rényi's Entropy , 1965, Inf. Control.
[16] A. Rényi. On the Foundations of Information Theory , 1965 .
[17] S. M. Ali,et al. A General Class of Coefficients of Divergence of One Distribution from Another , 1966 .
[18] L. Bregman. The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming , 1967 .
[19] Jan Havrda,et al. Quantification method of classification processes. Concept of structural a-entropy , 1967, Kybernetika.
[20] J. Andel. Sequential Analysis , 2022, The SAGE Encyclopedia of Research Design.
[21] R. Sibson. Information radius , 1969 .
[22] Zoltán Daróczy,et al. Generalized Information Functions , 1970, Inf. Control.
[23] Z. Daróczy. On the measurable solutions of a functional equation , 1971 .
[24] Suguru Arimoto,et al. Information-Theoretical Considerations on Estimation Problems , 1971, Inf. Control.
[25] I. Csiszár. A class of measures of informativity of observation channels , 1972 .
[26] P. Fischer. On the inequality Σpif(pi) ≥ Σpif(qi) , 1972 .
[27] C. T. Ng,et al. Measurable solutions of functional equations related to information theory , 1973 .
[28] C. T. Ng,et al. Why the Shannon and Hartley entropies are ‘natural’ , 1974, Advances in Applied Probability.
[29] C. T. Ng,et al. A functional equation and its application to information theory , 1974 .
[30] George T. Diderrich,et al. The Role of Boundedness in Characterizing Shannon Entropy , 1975, Inf. Control.
[31] János Aczél,et al. A mixed theory of information. I: symmetric, recursive and measurable entropies of randomized systems of events , 1978, RAIRO Theor. Informatics Appl.
[32] Moshe Ben-Bassat,et al. f-Entropies, Probability of Error, and Feature Selection , 1978, Inf. Control.
[33] Rodney W. Johnson,et al. Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy , 1980, IEEE Trans. Inf. Theory.
[34] Gyula Maksa. On the bounded solutions of a functional equation , 1981 .
[35] C. Tsallis. Possible generalization of Boltzmann-Gibbs statistics , 1988 .
[36] Charles L. Byrne,et al. General entropy criteria for inverse problems, with applications to data compression, pattern classification, and cluster analysis , 1990, IEEE Trans. Inf. Theory.
[37] Jeff B. Paris,et al. A note on the inevitability of maximum entropy , 1990, Int. J. Approx. Reason.
[38] I. Csiszár. Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems , 1991 .
[39] I. Csiszár. Generalized Cutoff Rates and Rényi's Information Measures , 1993, Proceedings. IEEE International Symposium on Information Theory.
[40] C. H. Bennett,et al. Generalized privacy amplification , 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[41] Imre Csiszár. Generalized cutoff rates and Rényi's information measures , 1995, IEEE Trans. Inf. Theory.
[42] P. K. Sahoo,et al. Characterizations of information measures , 1998 .
[43] R. Yeung,et al. On Characterization of Entropy Function via Information Inequalities , 1998 .
[44] Raymond W. Yeung,et al. A First Course in Information Theory , 2002 .
[45] A. Dawid,et al. Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory , 2004, math/0410076.
[46] Rudolf Ahlswede,et al. An Interpretation of Identification Entropy , 2006, IEEE Transactions on Information Theory.
[47] C. E. Shannon. A Mathematical Theory of Communication , 1948 .
[48] František Matúš,et al. Infinitely Many Information Inequalities , 2007, 2007 IEEE International Symposium on Information Theory.