Predicting {0, 1}-functions on randomly drawn points

The authors consider the problem of predicting {0, 1}-valued functions on R^n and smaller domains, based on their values on randomly drawn points. Their model is related to L.G. Valiant's learnability model (1984), but does not require the hypotheses used for prediction to be represented in any specified form. The authors first disregard computational complexity and show how to construct prediction strategies that are optimal to within a constant factor for any reasonable class F of target functions. These prediction strategies use the 1-inclusion graph structure from N. Alon et al.'s work on geometric range queries (1987) to minimize the probability of incorrect prediction. They then turn to computationally efficient algorithms. For indicator functions of axis-parallel rectangles and halfspaces in R^n, they demonstrate how their techniques can be applied to construct computationally efficient prediction strategies that are optimal to within a constant factor. They compare the general performance of prediction strategies derived by their method to that of strategies derived from existing methods in Valiant's learnability theory.
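The core of the 1-inclusion graph strategy can be sketched compactly. The Python sketch below is an illustrative reconstruction, not the authors' implementation: it assumes the finite set of behaviors (the distinct label vectors the class F can realize on the m sample points plus the query point) is given explicitly, and it substitutes a simple greedy peeling heuristic for an optimal minimum-out-degree orientation of the 1-inclusion graph; the names one_inclusion_predict and orient_low_outdegree are hypothetical.

    from itertools import combinations

    def orient_low_outdegree(vertices, edges):
        # Greedy peeling: repeatedly remove a vertex of minimum remaining
        # degree, orienting its remaining edges away from it.  Each vertex's
        # out-degree equals its degree at removal time, which is at most
        # twice the densest-subgraph density -- so roughly O(d) for a
        # 1-inclusion graph of a class of VC dimension d.  (An optimal
        # orientation achieves out-degree at most d.)
        adj = {v: set() for v in vertices}
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        orientation = {}
        remaining = set(vertices)
        while remaining:
            u = min(remaining, key=lambda v: len(adj[v] & remaining))
            for w in adj[u] & remaining:
                orientation[frozenset((u, w))] = w  # edge oriented u -> w
            remaining.remove(u)
        return orientation

    def one_inclusion_predict(behaviors, sample_labels, query):
        # behaviors:     set of tuples in {0,1}^(m+1), the label vectors
        #                realizable by F on the m sample points plus the
        #                query point
        # sample_labels: dict mapping each sample coordinate to its
        #                observed label (every coordinate except `query`)
        consistent = [b for b in behaviors
                      if all(b[i] == y for i, y in sample_labels.items())]
        values = {b[query] for b in consistent}
        if len(values) == 1:        # only one label is consistent with F
            return values.pop()
        # Ambiguous: exactly two consistent vectors, differing only at the
        # query coordinate; they form an edge of the 1-inclusion graph.
        # Predict the label of the edge's head, so that any target vertex
        # is mispredicted on at most out-degree-many coordinates.
        u, v = consistent
        edges = [(a, b) for a, b in combinations(behaviors, 2)
                 if sum(x != y for x, y in zip(a, b)) == 1]
        orientation = orient_low_outdegree(behaviors, edges)
        return orientation[frozenset((u, v))][query]

    # Toy class: thresholds x >= t on the domain {0, 1, 2}, evaluated on
    # the points (0, 1, 2); coordinate 2 is the (ambiguous) query point.
    behaviors = {(0, 0, 0), (0, 0, 1), (0, 1, 1), (1, 1, 1)}
    print(one_inclusion_predict(behaviors, {0: 0, 1: 0}, query=2))

By a leave-one-out symmetry argument, the expected error of such a strategy on m examples is at most (maximum out-degree)/(m + 1); with an orientation achieving out-degree at most the VC dimension d, this yields the d/(m + 1) rate that makes the strategy optimal to within a constant factor.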

[1] Vladimir Vapnik and Alexey Chervonenkis. On the uniform convergence of relative frequencies of events to their probabilities, 1971.

[2] Norbert Sauer. On the Density of Families of Sets, 1972, J. Comb. Theory, Ser. A.

[3] Temple F. Smith. Occam's razor, 1980, Nature.

[4] Leslie G. Valiant. A theory of the learnable, 1984, STOC '84.

[5] Narendra Karmarkar. A new polynomial-time algorithm for linear programming, 1984, Comb.

[6] David Haussler, et al. Classifying learnable geometric concepts with the Vapnik-Chervonenkis dimension, 1986, STOC '86.

[7] David Haussler, et al. Epsilon-nets and simplex range queries, 1986, SCG '86.

[8] Leslie G. Valiant, et al. On the learnability of Boolean formulae, 1987, STOC.

[9] David Haussler, et al. Occam's Razor, 1987, Inf. Process. Lett.

[10] N. Littlestone. Learning Quickly When Irrelevant Attributes Abound: A New Linear-Threshold Algorithm, 1987, 28th Annual Symposium on Foundations of Computer Science (FOCS 1987).

[11] Noga Alon, et al. Partitioning and geometric embedding of range spaces of finite Vapnik-Chervonenkis dimension, 1987, SCG '87.

[12] David Haussler, et al. ɛ-nets and simplex range queries, 1987, Discret. Comput. Geom.

[13] Emo Welzl. Partition trees for triangle counting and other range searching problems, 1988, SCG '88.

[14] Leonard Pitt, et al. Reductions among prediction problems: on the difficulty of predicting automata, 1988, Structure in Complexity Theory, Third Annual Conference.

[15] David Haussler, et al. Learning decision trees from random examples, 1988, COLT '88.

[16] Leonidas J. Guibas, et al. Implicitly representing arrangements of lines or segments, 1988, SCG '88.

[17] Leslie G. Valiant, et al. A general lower bound on the number of examples needed for learning, 1988, COLT '88.

[18] Leonidas J. Guibas, et al. The complexity of many faces in arrangements of lines and of segments, 1988, SCG '88.

[19] David Haussler, et al. Equivalence of models for polynomial learnability, 1988, COLT '88.

[20] David Haussler, et al. Learnability and the Vapnik-Chervonenkis dimension, 1989, JACM.

[21] Leonidas J. Guibas, et al. The complexity of many cells in arrangements of planes and related problems, 1990, Discret. Comput. Geom.