CPD Tree Learning Using Contexts as Background Knowledge

Context-specific independence (CSI) is an efficient means to capture independencies that hold only in certain contexts. Inference algorithms based on CSI can learn the Conditional Probability Distribution (CPD) tree relative to a target variable. We model motifs as specific contexts that are recurrently observed in data. These motifs can thus constitute domain knowledge that can be incorporated into a learning procedure. We show that integrating this prior knowledge improves learning performance and facilitates the interpretation of local structure.
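To make the notion of a CPD tree with context-specific independence concrete, here is a minimal sketch (all names and the toy distribution are illustrative assumptions, not the paper's algorithm): internal nodes test a parent variable, leaves hold a distribution over the target, and CSI appears wherever one leaf covers several parent configurations.

```python
# Minimal sketch (hypothetical names) of a CPD tree exhibiting
# context-specific independence (CSI): in the context A=0, the
# target Y is independent of B, so a single leaf serves both B values.

from dataclasses import dataclass
from typing import Dict, Union


@dataclass
class Leaf:
    dist: Dict[str, float]          # P(Y | context reaching this leaf)


@dataclass
class Node:
    var: str                        # parent variable tested at this node
    children: Dict[str, "CPDTree"]  # one subtree per value of `var`


CPDTree = Union[Leaf, Node]


def query(tree: CPDTree, assignment: Dict[str, str]) -> Dict[str, float]:
    """Walk the tree along the parents' assignment; return P(Y | parents)."""
    while isinstance(tree, Node):
        tree = tree.children[assignment[tree.var]]
    return tree.dist


# CPD tree for P(Y | A, B): when A=0, B is irrelevant (CSI), so the
# tree needs 3 leaves where a full tabular CPD would need 4 rows.
cpd = Node("A", {
    "0": Leaf({"y0": 0.9, "y1": 0.1}),       # context A=0: B is not tested
    "1": Node("B", {
        "0": Leaf({"y0": 0.4, "y1": 0.6}),
        "1": Leaf({"y0": 0.2, "y1": 0.8}),
    }),
})

# In the context A=0, both values of B reach the same leaf.
print(query(cpd, {"A": "0", "B": "0"}))
print(query(cpd, {"A": "0", "B": "1"}))
```

A recurring motif, in this reading, would be such a context (e.g. A=0) observed repeatedly across data, which the learner can be biased to test early in the tree.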
