Selective sampling, a realistic active learning model, has received recent attention in the learning theory literature. While the analysis of selective sampling is still in its infancy, we focus here on one of the (seemingly) simplest problems that remain open. Given a pool of unlabeled examples, drawn i.i.d. from an arbitrary input distribution known to the learner, and oracle access to their labels, the objective is to achieve a target error rate with minimum label complexity, via an efficient algorithm. No prior distribution is assumed over the concept class; however, the problem remains open even under the realizability assumption: there exists a target hypothesis in the concept class that perfectly classifies all examples, and the labeling oracle is noiseless.

As a precise variant of the problem, we consider learning homogeneous half-spaces in the realizable setting: unlabeled examples $x_t$ are drawn i.i.d. from a known distribution $D$ over the surface of the unit ball in $\mathbb{R}^d$, and labels $y_t$ are either $-1$ or $+1$. The target function is a half-space $u \cdot x > 0$, represented by a unit vector $u \in \mathbb{R}^d$ such that $y_t (u \cdot x_t) > 0$ for all $t$. We denote hypothesis $v$'s prediction by $v(x) = \mathrm{SGN}(v \cdot x)$.

Problem: Provide an algorithm for active learning of half-spaces such that, with high probability with respect to $D$ and any internal randomness:

1. After $L$ label queries, the algorithm's hypothesis $v$ obeys $P_{x \sim D}[v(x) \neq u(x)] < \epsilon$.
2. $L$ is at most the PAC sample complexity of the supervised problem, $O\!\left(\frac{d}{\epsilon}\log\frac{1}{\epsilon}\right)$, and for a general class of input distributions, $L$ is significantly lower.
3. The total running time is at most $\mathrm{poly}(d, \frac{1}{\epsilon})$.
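To make the quantities in the problem concrete, here is a minimal simulation sketch in Python. It assumes $D$ is the uniform distribution on the unit sphere, which is only one admissible choice of $D$, and the helper names `sample_unit_sphere`, `LabelOracle`, and `error_rate` are illustrative rather than part of the problem statement. A candidate active learner would see the unlabeled pool, interact with the oracle only through `query`, and be judged by the oracle's query count $L$ and by `error_rate`.

```python
import numpy as np

def sample_unit_sphere(n, d, rng):
    """Draw n points i.i.d. from the uniform distribution on the surface
    of the unit ball in R^d (one admissible choice of D)."""
    x = rng.standard_normal((n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

class LabelOracle:
    """Noiseless labeling oracle for the target half-space u . x > 0.
    It counts label queries so that L can be compared to the PAC benchmark."""
    def __init__(self, u):
        self.u = u / np.linalg.norm(u)   # hidden unit-length target
        self.num_queries = 0             # this count is L

    def query(self, x):
        """Return y in {-1, +1} with y * (u . x) > 0
        (u . x = 0 has probability zero under this D)."""
        self.num_queries += 1
        return np.sign(self.u @ x)

def error_rate(u, v, n_test=100_000, rng=None):
    """Monte Carlo estimate of P_{x ~ D}[SGN(v . x) != SGN(u . x)].
    Under the uniform distribution on the sphere this probability
    equals arccos(u . v) / pi."""
    rng = rng or np.random.default_rng(0)
    x = sample_unit_sphere(n_test, len(u), rng)
    return np.mean(np.sign(x @ u) != np.sign(x @ v))

# Illustrative setup: a hidden target, the oracle, and an unlabeled pool.
rng = np.random.default_rng(1)
d = 10
u = sample_unit_sphere(1, d, rng)[0]
oracle = LabelOracle(u)
pool = sample_unit_sphere(10_000, d, rng)
```

For this particular $D$, the disagreement probability in requirement 1 equals $\arccos(u \cdot v)/\pi$, so driving the error below $\epsilon$ amounts to driving the angle between $v$ and $u$ below $\pi\epsilon$; requirement 2 asks that this be done with noticeably fewer label queries than the $O\!\left(\frac{d}{\epsilon}\log\frac{1}{\epsilon}\right)$ examples a passive PAC learner would label.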