k+ Decision Trees

Consider a wireless sensor network in which each node possesses a bit of information. Suppose all sensors with the bit 1 broadcast this fact to a central processor. If zero or one sensors broadcast, the central processor can detect this fact. If two or more sensors broadcast, the central processor can only detect that there is a “collision.” Although collisions may seem to be a nuisance, they can in some cases help the central processor compute an aggregate function of the sensors’ data. Motivated by this scenario, we study a new model of computation for Boolean functions: the 2+ decision tree. This model augments the standard decision tree model: each internal node now queries an arbitrary set of literals and branches on whether 0, 1, or at least 2 of the literals are true. This model was suggested in a work of Ben-Asher and Newman [7] but does not seem to have been studied previously. Our main result shows that 2+ decision trees can “count” rather effectively. Specifically, we show that zero-error 2+ decision trees can compute the threshold-of-t symmetric function with O(t) expected queries (and that Ω(t) is a lower bound even for two-sided error 2+ decision trees). Interestingly, this feature is not shared by 1+ decision trees, demonstrating that “collisions can help.” Our result implies that the natural generalization to k+ decision trees does not give much more power than 2+ decision trees. We also prove a lower bound of Ω̃(t) · log(n/t) for the deterministic 2+ complexity of the threshold-of-t function, demonstrating that randomized 2+ complexity can in some cases be unboundedly better than deterministic 2+ complexity. Finally, we generalize the above results to arbitrary symmetric functions, and we discuss the relationship between k+ decision trees and other complexity notions such as decision tree rank and communication complexity.
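To make the query model concrete, here is a minimal Python sketch of a single 2+ query. The function name `two_plus_query` and the encoding of a literal as a pair (index, value) are our own illustrative choices, not notation from the paper; the sketch only captures the branching semantics described above.

```python
def two_plus_query(x, literals):
    """One 2+ decision tree query (illustrative sketch).

    `x` is the input bit vector; `literals` is a collection of literals,
    each encoded as a pair (i, b) meaning "bit x[i] equals b".
    The query reveals only whether 0, 1, or at least 2 of the queried
    literals are true -- two or more is indistinguishable (a "collision").
    """
    true_count = sum(1 for (i, b) in literals if x[i] == b)
    return min(true_count, 2)  # 0, 1, or 2, where 2 means "at least two"


# Sensor scenario: querying the positive literal of every bit tells the
# central processor whether zero, one, or at least two sensors hold a 1.
x = [1, 0, 1, 0]
print(two_plus_query(x, [(i, 1) for i in range(len(x))]))  # -> 2 (collision)
```

A 1+ query is the special case that only distinguishes "zero true" from "at least one true" (an OR query), which is why the extra collision outcome is what gives 2+ trees their counting power.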

[1] H. Buhrman et al. Complexity measures and decision tree complexity: a survey, 2002, Theor. Comput. Sci.

[2] Noam Nisan et al. CREW PRAMs and decision trees, 1989, STOC '89.

[3] D. Du et al. Combinatorial Group Testing and Its Applications, 1993.

[4] Ramamohan Paturi et al. On the degree of polynomials that approximate symmetric Boolean functions (preliminary version), 1992, STOC '92.

[5] Murat Demirbas et al. Consensus and collision detectors in wireless ad hoc networks, 2005, PODC '05.

[6] A. Razborov. Communication Complexity, 2011.

[7] Yosi Ben-Asher et al. Decision Trees with Boolean Threshold Queries, 1995, J. Comput. Syst. Sci.

[8] Eyal Kushilevitz et al. Learning decision trees using the Fourier spectrum, 1991, STOC '91.

[9] Michael Ben-Or et al. Lower bounds for algebraic computation trees, 1983, STOC.

[10] Murat Demirbas et al. A Singlehop Collaborative Feedback Primitive for Wireless Sensor Networks, 2008, IEEE INFOCOM 2008 - The 27th Conference on Computer Communications.

[11] Richard Beigel. Perceptrons, PP, and the polynomial hierarchy, 2005, computational complexity.

[12] Arnold L. Rosenberg. On the time required to recognize properties of graphs: a problem, 1973, SIGA.

[13] Shlomo Moran et al. Applications of Ramsey's theorem to decision tree complexity, 1985, JACM.

[14] K. Hamza. The smallest uniform upper bound on the distance between the mean and the median of the binomial and Poisson distributions, 1995.

[15] Desh Ranjan et al. Balls and bins: A study in negative dependence, 1996, Random Struct. Algorithms.

[16] Andrew Chi-Chih Yao. Monotone Bipartite Graph Properties are Evasive, 1988, SIAM J. Comput.

[17] David Haussler et al. Learning decision trees from random examples, 1988, COLT '88.

[18] Richard J. Lipton et al. Multidimensional Searching Problems, 1976, SIAM J. Comput.

[19] Laxmikant V. Kalé et al. Combinatorial Search, 2011, Encyclopedia of Parallel Computing.

[20] Nader H. Bshouty. A Subexponential Exact Learning Algorithm for DNF Using Equivalence Queries, 1996, Inf. Process. Lett.