Conditional logic and the Principle of Entropy

Abstract. The three-valued conditional logic of Calabrese is applied to the language L∗ of conditionals over propositional variables with finite domains. The conditionals in L∗ serve as a means for constructing and manipulating probability distributions that respect the Principles of Maximum Entropy and of Minimum Relative Entropy. These principles allow sound inference even in the presence of uncertain evidence. The inference is directed: it respects a probabilistic version of Modus Ponens (but not of Modus Tollens), it permits transitive chaining, and it supports cautious monotonicity. Conjunctive, conditional, and material deduction are also manageable in this probabilistic logic. The concept is not merely theoretical; it enables large-scale applications in the expert-system shell SPIRIT.
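To make the construction concrete, the following sketch (a hypothetical illustration, not the SPIRIT implementation) computes the maximum-entropy distribution over the four worlds of two propositional variables A and B when the only knowledge is the quantified conditional (B | A)[0.8]. The constraint P(B|A) = 0.8 is encoded as the linear equation P(A∧B) = 0.8 · P(A); the variable names and the use of scipy are assumptions made for illustration only.

```python
# Hypothetical illustration: maximum-entropy distribution over the worlds of
# two propositional variables A and B, given only the conditional (B | A)[0.8].
# Not the SPIRIT implementation; scipy is used here purely for convenience.
import numpy as np
from scipy.optimize import minimize

# World order: (A∧B), (A∧¬B), (¬A∧B), (¬A∧¬B)
def neg_entropy(p):
    return float(np.sum(p * np.log(p)))  # minimize -H(P), i.e. maximize entropy

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},               # normalization
    # P(B|A) = 0.8  <=>  P(A∧B) - 0.8 * (P(A∧B) + P(A∧¬B)) = 0
    {"type": "eq", "fun": lambda p: p[0] - 0.8 * (p[0] + p[1])},
]
bounds = [(1e-9, 1.0)] * 4
p0 = np.full(4, 0.25)  # uniform starting point

res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=bounds, constraints=constraints)
p = res.x
print("P(B|A)  =", p[0] / (p[0] + p[1]))  # ≈ 0.8 by construction
print("P(B|¬A) =", p[2] / (p[2] + p[3]))  # ≈ 0.5: MaxEnt stays indifferent
                                          # outside the conditional's scope
```

Replacing the uniform starting distribution by a previously computed one and minimizing the relative entropy Σ p log(p / p0) instead of −H(P) would give a sketch of the Minimum-Relative-Entropy update the abstract refers to.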
