The Fourier Entropy-Influence (FEI) Conjecture of Friedgut and Kalai states that ${\bf H}[f] \leq C \cdot {\bf I}[f]$ holds for every Boolean function $f$, where ${\bf H}[f]$ denotes the spectral entropy of $f$, ${\bf I}[f]$ is its total influence, and $C > 0$ is a universal constant. Despite significant interest in the conjecture, it has only been shown to hold for a few classes of Boolean functions, such as symmetric functions and read-once formulas.
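For reference, these two quantities have the standard definitions (writing $\widehat{f}(S)$ for the Fourier coefficients of $f : \{-1,1\}^n \to \{-1,1\}$, so that $\sum_S \widehat{f}(S)^2 = 1$ by Parseval and the squared coefficients form a probability distribution):

$${\bf H}[f] = \sum_{S \subseteq [n]} \widehat{f}(S)^2 \log_2 \frac{1}{\widehat{f}(S)^2}, \qquad {\bf I}[f] = \sum_{S \subseteq [n]} |S| \cdot \widehat{f}(S)^2.$$

Informally, the conjecture says that a function whose Fourier mass is spread out over many coefficients must have large total influence.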
In this work, we prove the conjecture for two extremal cases: functions with small influence and functions with high entropy. Specifically, we show that:
* FEI holds for the class of functions with ${\bf I}[f] \leq 2^{-cn}$ with the constant $C = 4 \cdot \frac{c+1}{c}$. Furthermore, proving FEI for a class of functions with ${\bf I}[f] \leq 2^{-s(n)}$ for some $s(n) = o(n)$ will imply FEI for the class of all Boolean functions.
* FEI holds for the class of functions with ${\bf H}[f] \geq cn$ with the constant $C = \frac{1 + c}{h^{-1}(c^2)}$. Furthermore, proving FEI for a class of functions with ${\bf H}[f] \geq s(n)$ for some $s(n) = o(n)$ will imply FEI for the class of all Boolean functions.
Additionally, we show that FEI holds for the class of functions with constant $\|\widehat{f}\|_1$, completing the results of Chakraborty et al. that bounded the entropy of such functions. We also improve the result of Wan et al. for read-k decision trees, from ${\bf H}[f] \leq O(k) \cdot {\bf I}[f]$ to ${\bf H}[f] \leq O(\sqrt{k}) \cdot {\bf I}[f]$. Finally, we suggest a direction for proving FEI for read-k DNFs, and prove the Fourier Min-Entropy/Influence (FMEI) Conjecture for regular read-k DNFs.
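As a concrete sanity check on the quantities in the conjecture, here is a minimal brute-force sketch (function names are ours, not from the paper) that computes ${\bf H}[f]$ and ${\bf I}[f]$ from the Fourier expansion, evaluated on majority of 3 bits:

```python
import itertools
import math

def fourier_coeffs(f, n):
    """All Fourier coefficients of f: {0,1}^n -> {-1,+1}:
    hat_f(S) = E_x[f(x) * chi_S(x)], where chi_S(x) = (-1)^(sum of x_i, i in S)."""
    points = list(itertools.product([0, 1], repeat=n))
    subsets = itertools.chain.from_iterable(
        itertools.combinations(range(n), r) for r in range(n + 1))
    return {S: sum(f(x) * (-1) ** sum(x[i] for i in S) for x in points) / 2 ** n
            for S in subsets}

def spectral_entropy(coeffs):
    # H[f] = sum_S hat_f(S)^2 * log2(1 / hat_f(S)^2)  (zero coefficients contribute 0)
    return sum(c * c * math.log2(1 / (c * c)) for c in coeffs.values() if c != 0)

def total_influence(coeffs):
    # I[f] = sum_S |S| * hat_f(S)^2
    return sum(len(S) * c * c for S, c in coeffs.items())

maj3 = lambda x: 1 if sum(x) >= 2 else -1  # majority of 3 bits
co = fourier_coeffs(maj3, 3)
H, I = spectral_entropy(co), total_influence(co)
print(H, I)  # 2.0 1.5
```

For MAJ3 the four nonzero coefficients each have squared mass $1/4$, giving ${\bf H}[f] = 2$ and ${\bf I}[f] = 3/2$, so the FEI inequality holds here with $C = 4/3$.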
[1] J. Bourgain et al., Influences of Variables and Threshold Intervals under Group Symmetries, 1997.
[2] Yishay Mansour et al., An O(n^(log log n)) Learning Algorithm for DNF under the Uniform Distribution, J. Comput. Syst. Sci., 1995.
[3] Ryan O'Donnell et al., A Composition Theorem for the Fourier Entropy-Influence Conjecture, ICALP, 2013.
[4] Satyanarayana V. Lokam et al., Upper bounds on Fourier entropy, Theor. Comput. Sci., 2015.
[5] Elchanan Mossel et al., A note on the Entropy/Influence conjecture, Discret. Math., 2012.
[6] Bireswar Das et al., The Entropy Influence Conjecture Revisited, Electron. Colloquium Comput. Complex., 2011.
[7] Avishay Tal et al., Degree and Sensitivity: tails of two distributions, Electron. Colloquium Comput. Complex., 2016.
[8] T. Sanders et al., Analysis of Boolean Functions, ArXiv, 2012.
[9] Andrew Wan et al., Decision trees, protocols and the entropy-influence conjecture, ITCS, 2014.
[10] G. Kalai et al., Every monotone graph property has a sharp threshold, 1996.
[11] Yishay Mansour et al., Weakly learning DNF and characterizing statistical query learning using Fourier analysis, STOC '94, 1994.
[12] Nathan Linial et al., The influence of variables on Boolean functions, 29th Annual Symposium on Foundations of Computer Science (FOCS), 1988.
[13] Adam Tauman Kalai et al., Agnostically learning decision trees, STOC, 2008.
[14] Ryan O'Donnell et al., The Fourier Entropy-Influence Conjecture for Certain Classes of Boolean Functions, ICALP, 2011.
[15] Andrew Wan et al., Mansour's Conjecture is True for Random DNF Formulas, COLT, 2010.