Conditional Information Inequalities for Entropic and Almost Entropic Points

We study conditional linear information inequalities, that is, linear inequalities for Shannon entropy that hold only for distributions whose joint entropies satisfy certain linear constraints. We prove that some conditional information inequalities cannot be derived from any unconditional linear inequality. Some of these conditional inequalities hold for almost entropic points, while others do not. We also discuss counterparts of conditional information inequalities for Kolmogorov complexity.
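As a concrete illustration of what a conditional information inequality asserts, the sketch below numerically checks the conditional inequality of Zhang and Yeung (1997): if I(A:B) = I(A:B|C) = 0, then I(C:D) ≤ I(C:D|A) + I(C:D|B). The test distribution (independent uniform bits A, B, C with D = C) is an illustrative choice of ours, not taken from the paper; it merely satisfies both linear constraints.

```python
# Numerical sanity check of a conditional information inequality
# (Zhang-Yeung, 1997): if I(A:B) = I(A:B|C) = 0, then
#     I(C:D) <= I(C:D|A) + I(C:D|B).
# The distribution below is an illustrative choice: A, B, C are
# independent uniform bits and D = C, so both constraints hold.
from itertools import product
from math import log2

# Joint pmf over outcomes (a, b, c, d).
p = {}
for a, b, c in product((0, 1), repeat=3):
    p[(a, b, c, c)] = 1 / 8  # D = C deterministically

def H(idx):
    """Shannon entropy of the marginal on the given coordinate indices."""
    marg = {}
    for outcome, pr in p.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + pr
    return -sum(pr * log2(pr) for pr in marg.values() if pr > 0)

A, B, C, D = 0, 1, 2, 3

def I(x, y, given=()):
    """Conditional mutual information I(x : y | given)."""
    g = tuple(given)
    return H((x,) + g) + H((y,) + g) - H((x, y) + g) - H(g)

# The two linear constraints of the conditional inequality hold:
assert abs(I(A, B)) < 1e-9 and abs(I(A, B, (C,))) < 1e-9

print(I(C, D), I(C, D, (A,)) + I(C, D, (B,)))  # prints: 1.0 2.0
```

Here I(C:D) = 1 ≤ 2 = I(C:D|A) + I(C:D|B), so the conclusion of the conditional inequality holds for this distribution; the point of the paper is the behavior of such inequalities in general, not any single example.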
