Abduction for Discourse Interpretation: A Probabilistic Framework

Abduction allows us to model the interpretation of discourse as the explanation of observables, given additional knowledge about the world. In an abductive framework, many explanations can be constructed for the same observation, requiring an approach to estimate the likelihood of these alternative explanations. We show that, for discourse interpretation, weighted abduction has advantages over alternative approaches to estimating the likelihood of hypotheses. However, weighted abduction has no probabilistic interpretation, which makes the estimation and learning of weights difficult. To address this, we propose a formal probabilistic abductive framework that captures the advantages of weighted abduction when applied to discourse interpretation.
