On the query complexity of learning

We consider the problem of learning parametrized concept classes with membership and equivalence queries. If C_n is the concept class being learned, we show that if equivalence queries may be drawn from a larger but still 'reasonable' hypothesis class, then O(n log |C_n|) queries suffice to exactly learn the target concept c ∈ C_n. We also show that our results are best possible in terms of how large the hypothesis class needs to be, by appealing to a result of Boppana [4].
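The benefit of allowing hypotheses outside C_n can be illustrated with the standard halving algorithm (a textbook device, not the paper's construction): if the learner may propose a majority vote over the still-consistent concepts, which generally is not itself a member of C_n, then every counterexample eliminates at least half of the remaining candidates, so O(log |C_n|) equivalence queries suffice. The sketch below is a minimal Python illustration of that idea; the names (majority_hypothesis, halving_learner, the threshold-concept toy class, the oracle) are hypothetical and chosen only for the example.

```python
# Halving-algorithm sketch: hypotheses are majority votes over the current
# version space, so each counterexample discards at least half the candidates.
from typing import Callable, List, Optional, Tuple

Concept = Callable[[int], bool]  # a concept labels an instance True/False


def majority_hypothesis(version_space: List[Concept]) -> Concept:
    """Majority vote of all concepts still consistent with the answers seen."""
    def h(x: int) -> bool:
        votes = sum(1 for c in version_space if c(x))
        return 2 * votes >= len(version_space)
    return h


def halving_learner(
    concepts: List[Concept],
    equivalence_oracle: Callable[[Concept], Optional[Tuple[int, bool]]],
) -> Concept:
    """Exactly learn the target with at most about log2(|concepts|) + 1
    equivalence queries.  The oracle returns None if the hypothesis is
    correct, otherwise a counterexample (x, target_label)."""
    version_space = list(concepts)
    while True:
        h = majority_hypothesis(version_space)
        counterexample = equivalence_oracle(h)
        if counterexample is None:
            return h
        x, label = counterexample
        # The majority was wrong on x, so at least half of the remaining
        # concepts disagree with the target on x and can be discarded.
        version_space = [c for c in version_space if c(x) == label]


if __name__ == "__main__":
    # Toy class: threshold concepts c_t(x) = (x >= t) on {0, ..., 15}.
    concepts = [lambda x, t=t: x >= t for t in range(16)]
    target = concepts[11]

    def oracle(h: Concept) -> Optional[Tuple[int, bool]]:
        for x in range(16):
            if h(x) != target(x):
                return (x, target(x))
        return None

    learned = halving_learner(concepts, oracle)
    assert all(learned(x) == target(x) for x in range(16))
```

Note the design point this is meant to convey: the majority-vote hypothesis typically lies outside C_n, which is exactly why a larger hypothesis class sidesteps Angluin's negative results for equivalence queries [1].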

[1] Dana Angluin. Negative results for equivalence queries. Machine Learning, 1990.

[2] Vasek Chvátal. Probabilistic methods in graph theory. Annals of Operations Research, 1984.

[3] Leslie G. Valiant. A theory of the learnable. Communications of the ACM, 1984.

[4] Ravi B. Boppana. Amplification of probabilistic Boolean formulas. 26th Annual Symposium on Foundations of Computer Science (FOCS), 1985.

[5] Leslie G. Valiant. Short monotone formulae for the majority function. Journal of Algorithms, 1984.

[6] Joel Spencer. Ten Lectures on the Probabilistic Method, 1987.

[7] Michael Kearns. Computational complexity of machine learning. ACM Distinguished Dissertations, 1990.

[8] Dana Angluin. Queries and concept learning. Machine Learning, 1988.