Any unconstrained information inequality in three or fewer random variables can be written as a linear combination of instances of Shannon's inequality I(A;B|C) >= 0. Such inequalities are sometimes referred to as "Shannon" inequalities. In 1998, Zhang and Yeung gave the first example of a "non-Shannon" information inequality in four variables. Their technique was to add two auxiliary variables with special properties and then apply Shannon inequalities to the enlarged list. Here we show that the Zhang-Yeung inequality can in fact be derived from just one auxiliary variable. We then use the same basic technique of adding auxiliary variables to give many other non-Shannon inequalities in four variables. Our list includes the inequalities found by Xu, Wang, and Sun, but it is by no means exhaustive. Furthermore, some of the inequalities obtained may be superseded by stronger inequalities that have yet to be found; indeed, we show that the Zhang-Yeung inequality is one of those that is superseded. We also present several infinite families of inequalities, including some, but not all, of the infinite families found by Matus. We then describe what additional information these inequalities give about entropy space, including a conjecture on the maximum possible failure of Ingleton's inequality. Finally, we present an application of non-Shannon inequalities to network coding, demonstrating how these inequalities yield bounds on the information that can flow through a particular network called the Vamos network.
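As an illustrative sketch (not taken from the paper), both kinds of inequality mentioned above can be checked numerically on a concrete joint distribution of four random variables: the Shannon inequality I(A;B|C) >= 0, and the Zhang-Yeung inequality in its commonly cited form 2I(C;D) <= I(A;B) + I(A;C,D) + 3I(C;D|A) + I(C;D|B). The function names and the choice of a random joint distribution below are illustrative assumptions.

```python
import itertools
import math
import random

def marginal_entropy(joint, subset):
    # H(X_S): entropy of the marginal of the coordinates in `subset`,
    # where `joint` maps outcome tuples to probabilities.
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in subset)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

def mutual_info(joint, X, Y, given=()):
    # I(X;Y|Z) = H(XZ) + H(YZ) - H(XYZ) - H(Z)
    Z = tuple(given)
    H = lambda S: marginal_entropy(joint, tuple(sorted(set(S))))
    return H(X + Z) + H(Y + Z) - H(X + Y + Z) - H(Z)

# A random joint pmf over four binary variables A, B, C, D (coordinates 0..3).
random.seed(0)
outcomes = list(itertools.product([0, 1], repeat=4))
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint = {o: w / total for o, w in zip(outcomes, weights)}

A, B, C, D = (0,), (1,), (2,), (3,)

# Shannon inequality: I(A;B|C) >= 0.
shannon = mutual_info(joint, A, B, given=C)

# Zhang-Yeung: 2 I(C;D) <= I(A;B) + I(A;CD) + 3 I(C;D|A) + I(C;D|B).
lhs = 2 * mutual_info(joint, C, D)
rhs = (mutual_info(joint, A, B) + mutual_info(joint, A, C + D)
       + 3 * mutual_info(joint, C, D, given=A)
       + mutual_info(joint, C, D, given=B))
print(shannon >= -1e-9, lhs <= rhs + 1e-9)
```

Since both inequalities hold for every entropic vector, the check passes (up to floating-point tolerance) for any valid joint distribution; a violation would only indicate a point outside the entropic region.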
[1] Alex J. Grant et al., "Entropy Vectors and Network Codes," 2007 IEEE International Symposium on Information Theory, 2007.
[2] Frantisek Matús et al., "Infinitely Many Information Inequalities," 2007 IEEE International Symposium on Information Theory, 2007.
[3] Zhen Zhang et al., "A non-Shannon-type conditional inequality of information quantities," IEEE Trans. Inf. Theory, 1997.
[4] Nikolai K. Vereshchagin et al., "A new class of non-Shannon-type inequalities for entropies," Commun. Inf. Syst., 2002.
[5] Sang Joon Kim et al., "A Mathematical Theory of Communication," 2006.
[6] Weidong Xu et al., "A projection method for derivation of non-Shannon-type information inequalities," 2008 IEEE International Symposium on Information Theory, 2008.
[7] Radim Lněnička et al., "On the tightness of the Zhang-Yeung inequality for Gaussian vectors," Commun. Inf. Syst., 2003.
[8] Randall Dougherty et al., "Six New Non-Shannon Information Inequalities," 2006 IEEE International Symposium on Information Theory, 2006.
[9] Raymond W. Yeung et al., A First Course in Information Theory, 2002.
[10] Zhen Zhang et al., "On a new non-Shannon-type information inequality," Proceedings of the IEEE International Symposium on Information Theory, 2002.
[11] Randall Dougherty et al., "Networks, Matroids, and Non-Shannon Information Inequalities," IEEE Trans. Inf. Theory, 2007.
[12] Zhen Zhang et al., "On Characterization of Entropy Function via Information Inequalities," IEEE Trans. Inf. Theory, 1998.