A Two-Level Approach to Maximum Entropy Model Computation for Relational Probabilistic Logic Based on Weighted Conditional Impacts

The principle of maximum entropy defines the semantics of a knowledge base consisting of a set of probabilistic relational conditionals by its unique model having maximum entropy. Using the concept of the conditional structure of a world, we introduce the notion of weighted conditional impacts and present a two-level approach to maximum entropy model computation based on them. Once the weighted conditional impacts of a knowledge base have been determined, a generalized iterative scaling algorithm computes the maximum entropy model while fully abstracting from concrete worlds. The weighted conditional impacts can be reused when only the quantitative aspects of the knowledge base change. As a further extension of previous work, deterministic conditionals may also be present in the knowledge base, and a special treatment of such conditionals reduces the problem size.
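The following Python sketch illustrates the two-level idea under simplifying assumptions (already-grounded worlds, non-extreme probabilities 0 < p_k < 1, and a consistent knowledge base). The first function groups worlds with identical conditional structure into weighted conditional impacts; the second computes a maximum entropy model over these classes only. For brevity, the second level minimizes the log-partition function with a generic quasi-Newton solver instead of the paper's generalized iterative scaling variant; both characterize the same maximum entropy model. All function names and data layouts here are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the two-level approach (illustrative, not the paper's code).
from collections import Counter

import numpy as np
from scipy.optimize import minimize


def weighted_conditional_impacts(worlds, conditionals):
    """Level 1: group worlds by their conditional structure.

    conditionals : list of (verifies, falsifies) pairs of counting functions,
                   mapping a world to the number of verified / falsified
                   (ground) instances of that conditional.
    Returns {impact_vector: number_of_worlds_in_class}.
    """
    impacts = Counter()
    for w in worlds:
        vec = tuple((verifies(w), falsifies(w)) for verifies, falsifies in conditionals)
        impacts[vec] += 1
    return impacts


def maxent_over_impacts(impacts, probs):
    """Level 2: compute a maximum entropy model on the equivalence classes.

    impacts : {((v_1, f_1), ..., (v_K, f_K)): weight} as returned above
    probs   : target probabilities p_k of the K conditionals (all strictly
              between 0 and 1 in this sketch)
    Returns the probability mass of each class; worlds within a class are
    equally likely under the maximum entropy model.
    """
    classes = list(impacts.keys())
    weights = np.array([impacts[c] for c in classes], dtype=float)
    # Feature g_k(c) = v_k * (1 - p_k) - f_k * p_k encodes the constraint
    # P(A_k B_k) = p_k * P(A_k) as the expectation E[g_k] = 0.
    G = np.array([[v * (1.0 - p) - f * p for (v, f), p in zip(c, probs)]
                  for c in classes])

    def log_partition(lmbda):
        # log Z(lambda); its gradient is E[g_k], which vanishes at the optimum.
        scores = np.log(weights) + G @ lmbda
        m = scores.max()
        return m + np.log(np.exp(scores - m).sum())

    res = minimize(log_partition, np.zeros(len(probs)), method="BFGS")
    scores = np.log(weights) + G @ res.x
    p = np.exp(scores - scores.max())
    return dict(zip(classes, p / p.sum()))
```

Because the impacts depend only on the qualitative structure of the conditionals, the dictionary returned by the hypothetical weighted_conditional_impacts can be reused across calls to maxent_over_impacts when only the probabilities p_k change, mirroring the reuse of weighted conditional impacts described above. Deterministic conditionals (p_k equal to 0 or 1) are not covered by this sketch; they require the special treatment mentioned in the abstract.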
