Deceptive updating and minimal information methods