Minimal Characterization of Shannon-Type Inequalities Under Functional Dependence and Full Conditional Independence Structures

The minimal set of Shannon-type inequalities (the elemental inequalities) plays a central role both in efficiently determining whether a given inequality is Shannon-type and in computing the linear programming bound on network coding capacity. In many cases, the random variables under consideration are subject to additional constraints, such as functional dependence and conditional independence constraints. For example, functional dependence constraints arise in many communication problems because encoding and decoding are deterministic. In other situations, the variables may form a Markov chain or, more generally, a Markov random field, leading to conditional independence constraints. Under such additional constraints, the challenge is to identify the non-redundant inequalities. While the non-redundant inequalities can always be determined numerically (subject to the additional linear equality constraints), it is far more useful if they can be listed explicitly. In this paper, we show that this is achievable under functional dependence and full conditional independence constraints.
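As background for the linear-programming check mentioned above, the following Python sketch (using numpy and scipy, both assumed available) tests whether a candidate linear inequality on entropies is Shannon-type by minimizing it over the cone defined by the elemental inequalities, optionally with additional linear equality constraints such as functional dependencies. This is a sketch of the standard unconstrained ITIP-style check, not the explicit minimal characterization developed in the paper; names such as elemental_inequalities and is_shannon_type are illustrative.

from itertools import combinations

import numpy as np
from scipy.optimize import linprog


def subsets(items):
    # All non-empty subsets of items, as frozensets.
    items = list(items)
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            yield frozenset(combo)


def elemental_inequalities(n):
    # Rows A such that A @ h >= 0 are the elemental inequalities for n random
    # variables; h is indexed by the non-empty subsets of {0, ..., n-1}.
    index = {s: k for k, s in enumerate(subsets(range(n)))}
    dim = len(index)

    def coeff(plus, minus):
        row = np.zeros(dim)
        for s in plus:
            if s:
                row[index[frozenset(s)]] += 1
        for s in minus:
            if s:
                row[index[frozenset(s)]] -= 1
        return row

    ground = set(range(n))
    rows = []
    # H(X_i | X_{N \ {i}}) >= 0 for every i.
    for i in range(n):
        rows.append(coeff([ground], [ground - {i}]))
    # I(X_i; X_j | X_K) >= 0 for all i < j and K a subset of N \ {i, j}.
    for i, j in combinations(range(n), 2):
        rest = ground - {i, j}
        for r in range(len(rest) + 1):
            for K in combinations(rest, r):
                K = set(K)
                rows.append(coeff([K | {i}, K | {j}], [K | {i, j}, K]))
    return np.array(rows), index


def is_shannon_type(c, n, eq_rows=None):
    # True if c @ h >= 0 holds for every h satisfying the elemental
    # inequalities (and eq_rows @ h = 0, if equality constraints are given).
    A, _ = elemental_inequalities(n)
    res = linprog(
        c,
        A_ub=-A, b_ub=np.zeros(len(A)),     # elemental inequalities: A h >= 0
        A_eq=eq_rows,
        b_eq=None if eq_rows is None else np.zeros(len(eq_rows)),
        bounds=(None, None),                # entropy vector is otherwise free
        method="highs",
    )
    # The feasible set is a cone, so the optimum is either 0 (the inequality
    # is implied) or unbounded below (it is not provable from the constraints).
    return res.status == 0 and res.fun > -1e-9


if __name__ == "__main__":
    # Example: I(X1; X2) = H(X1) + H(X2) - H(X1, X2) >= 0 is Shannon-type.
    n = 2
    _, index = elemental_inequalities(n)
    c = np.zeros(len(index))
    c[index[frozenset({0})]] += 1
    c[index[frozenset({1})]] += 1
    c[index[frozenset({0, 1})]] -= 1
    print(is_shannon_type(c, n))   # expected output: True

To model a functional dependence such as X2 being a function of X1, one would pass a single equality row encoding H(X1, X2) - H(X1) = 0 via eq_rows; the redundant inequalities studied in the paper are then those rows of A that become implied by the remaining rows together with the equality constraints.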
