Context-Specific Independence in Bayesian Networks

Bayesian networks provide a language for qualitatively representing the conditional independence properties of a distribution. This allows a natural and compact representation of the distribution, eases knowledge acquisition, and supports effective inference algorithms. It is well known, however, that there are certain independencies that we cannot capture qualitatively within the Bayesian network structure: independencies that hold only in certain contexts, i.e., given a specific assignment of values to certain variables. In this paper, we propose a formal notion of context-specific independence (CSI), based on regularities in the conditional probability tables (CPTs) at a node. We present a technique, analogous to (and based on) d-separation, for determining when such independence holds in a given network. We then focus on a particular qualitative representation scheme, tree-structured CPTs, for capturing CSI. We suggest ways in which this representation can be used to support effective inference algorithms. In particular, we present a structural decomposition of the resulting network that can improve the performance of clustering algorithms, and an alternative algorithm based on cutset conditioning.
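
To make the notion concrete, here is a minimal Python sketch (our own illustration, not code from the paper) of a tree-structured CPT for a variable Y with three binary parents A, B, and C. The tree encodes two context-specific independencies: in the context A = 0, Y is independent of C (whatever B is), and in the context A = 1, Y is independent of B. A full tabular CPT over the three parents would need eight rows, while the tree needs only four leaves.

# Sketch of a tree-structured CPT (illustrative construction, not from the paper).
# Internal nodes test one parent; leaves hold a distribution over Y's values.

class Leaf:
    """A leaf holds a distribution over Y's values."""
    def __init__(self, dist):
        self.dist = dist          # e.g. {0: 0.9, 1: 0.1}

class Split:
    """An internal node branches on the value of one parent variable."""
    def __init__(self, var, children):
        self.var = var            # name of the parent being tested
        self.children = children  # maps a value of var to a subtree

def cpt_lookup(node, assignment):
    """Walk the tree using the parents' values and return P(Y | parents)."""
    while isinstance(node, Split):
        node = node.children[assignment[node.var]]
    return node.dist

# Split on A first; in the context A = 0 only B matters, in A = 1 only C.
tree_cpt = Split("A", {
    0: Split("B", {0: Leaf({0: 0.9, 1: 0.1}),
                   1: Leaf({0: 0.4, 1: 0.6})}),
    1: Split("C", {0: Leaf({0: 0.7, 1: 0.3}),
                   1: Leaf({0: 0.2, 1: 0.8})}),
})

# In the context A = 0, changing C leaves Y's distribution unchanged;
# this is exactly the context-specific independence the tree makes explicit.
assert (cpt_lookup(tree_cpt, {"A": 0, "B": 1, "C": 0})
        == cpt_lookup(tree_cpt, {"A": 0, "B": 1, "C": 1}))

A d-separation-style test can exploit this structure: once the context A = 0 is observed, the arc from C to Y is effectively vacuous, which is the kind of regularity the decomposition and conditioning methods in the paper rely on.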
