Information-theoretic characterizations of conditional mutual independence and Markov random fields

We take the point of view that a Markov random field is a collection of so-called full conditional mutual independencies. Using the theory of the I-measure, we obtain a number of fundamental characterizations of conditional mutual independence and Markov random fields, and we show that many aspects of both admit very simple set-theoretic descriptions, yielding new insights into their structure. Our results have immediate applications to the implication problem for probabilistic conditional independence and to relational databases. We also obtain a hypergraph characterization of a Markov random field, which makes it legitimate to view a Markov random field as a hypergraph. Based on this result, we naturally employ Graham reduction, a tool from relational database theory, to recognize a Markov forest. This connection between Markov random fields and hypergraphs sheds light on the possible role of hypergraph theory in the study of Markov random fields.
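To make the last point concrete, here is a minimal sketch of Graham (also called GYO) reduction on a hypergraph given as a family of vertex sets. This is the standard textbook reduction, not code from the paper; the function names, input convention, and example hypergraphs below are all illustrative assumptions. The reduction terminates with the empty hypergraph exactly when the hypergraph is acyclic, which is the test the abstract refers to for recognizing a Markov forest.

    # A sketch of Graham (GYO) reduction for testing hypergraph
    # acyclicity. Hyperedges are plain Python sets of vertex labels;
    # all names here are illustrative, not taken from the paper.

    def graham_reduce(hyperedges):
        """Apply the two Graham reduction rules until neither applies:
        (1) delete every vertex that occurs in exactly one hyperedge;
        (2) delete every hyperedge properly contained in another.
        Returns the irreducible residue; an empty residue means the
        hypergraph is acyclic."""
        edges = {frozenset(e) for e in hyperedges}
        changed = True
        while changed:
            changed = False
            # Rule 1: strip vertices that appear in only one hyperedge.
            count = {}
            for e in edges:
                for v in e:
                    count[v] = count.get(v, 0) + 1
            lonely = {v for v, c in count.items() if c == 1}
            if lonely:
                reduced = {frozenset(e - lonely) for e in edges}
                reduced.discard(frozenset())
                if reduced != edges:
                    edges, changed = reduced, True
            # Rule 2: drop hyperedges properly contained in another.
            redundant = {e for e in edges if any(e < f for f in edges)}
            if redundant:
                edges -= redundant
                changed = True
        return edges

    def is_acyclic(hyperedges):
        return not graham_reduce(hyperedges)

    # A chain of pairwise overlaps reduces to nothing (acyclic), while
    # a three-edge cycle leaves an irreducible residue (cyclic).
    print(is_acyclic([{"a", "b"}, {"b", "c"}, {"c", "d"}]))  # True
    print(is_acyclic([{"a", "b"}, {"b", "c"}, {"c", "a"}]))  # False

The two rules can enable each other (removing a contained hyperedge may leave a vertex in only one remaining hyperedge, and vice versa), so the loop reruns both rules until a full pass changes nothing.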
