Fourier Entropy-Influence Conjecture for Random Linear Threshold Functions

The Fourier Entropy-Influence (FEI) Conjecture states that for any Boolean function \(f:\{+1,-1\}^n \rightarrow \{+1,-1\}\), the Fourier entropy of \(f\) is at most a universal constant times its total influence. While the FEI conjecture has been proved for many classes of Boolean functions, it is still not known whether it holds for the class of linear threshold functions. A natural question is: does the FEI conjecture hold for a “random” linear threshold function? In this paper, we answer this question in the affirmative. We consider two natural distributions on the weights defining a linear threshold function, namely the uniform distribution on \([-1,1]\) and the normal distribution.
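
In standard notation, writing \(\hat{f}(S)\) for the Fourier coefficients of \(f\) over subsets \(S \subseteq [n]\), the conjecture asserts that there is a universal constant \(C > 0\) such that
\[
\mathbf{H}[f] \;=\; \sum_{S \subseteq [n]} \hat{f}(S)^2 \log \frac{1}{\hat{f}(S)^2} \;\le\; C \sum_{S \subseteq [n]} |S|\,\hat{f}(S)^2 \;=\; C \cdot \mathbf{Inf}[f].
\]
By Parseval, \(\sum_{S} \hat{f}(S)^2 = 1\) for a \(\pm 1\)-valued \(f\), so the left-hand side is the Shannon entropy of the probability distribution that the squared Fourier coefficients place on subsets, while the right-hand side is (up to the constant) the total influence, i.e. the average sensitivity. A linear threshold function is one of the form \(f(x) = \mathrm{sign}(w_1 x_1 + \cdots + w_n x_n - \theta)\); in the random models above, the weights \(w_i\) are random draws from the stated distribution.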
