A Tight Upper Bound on the Mutual Information of Two Boolean Functions

Let (X, Y) be a doubly symmetric binary source. For n i.i.d. copies (X<sup>n</sup>, Y<sup>n</sup>) of (X, Y), we show that max I(f(X<sup>n</sup>); g(Y<sup>n</sup>)) = I(X; Y), where the maximum is taken over all Boolean functions f, g: {0, 1}<sup>n</sup> → {0, 1}; the maximum is attained by the dictator functions, e.g., f(x<sup>n</sup>) = g(x<sup>n</sup>) = x<sub>1</sub>. This resolves, in the affirmative, a conjecture published by Kumar and Courtade in 2013.
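The claim can be checked directly for small n by exhaustive search: every Boolean function on {0, 1}<sup>n</sup> is a truth table of length 2<sup>n</sup>, so for modest n one can enumerate all pairs (f, g), compute I(f(X<sup>n</sup>); g(Y<sup>n</sup>)) under the product DSBS distribution, and compare the maximum against I(X; Y) = 1 − h(p), where h is the binary entropy and p the crossover probability. A minimal sketch (the names `max_mi` and `h` are ours, not from the paper):

```python
import itertools
import math

def h(q):
    """Binary entropy in bits; h(0) = h(1) = 0."""
    return 0.0 if q in (0.0, 1.0) else -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def max_mi(n, p):
    """Brute-force max of I(f(X^n); g(Y^n)) over all Boolean f, g
    for a doubly symmetric binary source with crossover probability p."""
    xs = list(itertools.product([0, 1], repeat=n))
    idx = {x: i for i, x in enumerate(xs)}
    # Joint pmf of (X^n, Y^n): i.i.d. coordinates, each pair with
    # P(x_i, y_i) = (1/2)(1 - p) if x_i == y_i, else (1/2)p.
    joint = {}
    for x in xs:
        for y in xs:
            pr = 1.0
            for xi, yi in zip(x, y):
                pr *= 0.5 * ((1 - p) if xi == yi else p)
            joint[(x, y)] = pr
    best = 0.0
    # Every Boolean function on {0,1}^n is a truth table of length 2^n.
    for f in itertools.product([0, 1], repeat=len(xs)):
        for g in itertools.product([0, 1], repeat=len(xs)):
            q = [[0.0, 0.0], [0.0, 0.0]]  # pmf of (f(X^n), g(Y^n))
            for (x, y), pr in joint.items():
                q[f[idx[x]]][g[idx[y]]] += pr
            qf = [q[0][0] + q[0][1], q[1][0] + q[1][1]]  # marginal of f(X^n)
            qg = [q[0][0] + q[1][0], q[0][1] + q[1][1]]  # marginal of g(Y^n)
            mi = sum(q[a][b] * math.log2(q[a][b] / (qf[a] * qg[b]))
                     for a in (0, 1) for b in (0, 1) if q[a][b] > 0)
            best = max(best, mi)
    return best
```

For instance, `max_mi(2, 0.1)` returns 1 − h(0.1) ≈ 0.5310 bits, the mutual information of a single DSBS pair, attained by the dictator functions f(x<sup>n</sup>) = x<sub>1</sub>, g(y<sup>n</sup>) = y<sub>1</sub>. The search is doubly exponential in n (2<sup>2<sup>n</sup></sup> truth tables per side), which is exactly why the general statement requires proof rather than enumeration.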
