Bounds on the Number of Examples Needed for Learning Functions

We prove general lower bounds on the number of examples needed for learning function classes within several natural learning models related to pac-learning (models that coincide with Valiant's pac-learning model in the case of {0,1}-valued functions). The lower bounds are obtained by showing that every nontrivial function class contains a "hard binary-valued subproblem." Although at first glance it seems likely that real-valued function classes are much harder to learn than their hardest binary-valued subproblems, we show that these general lower bounds cannot be improved by more than a logarithmic factor. This is done by analyzing some natural function classes, such as nondecreasing functions and piecewise-smooth functions with certain restrictions on their slope (the function classes discussed in [M. J. Kearns and R. E. Schapire, Proc. 31st Annual Symposium on Foundations of Computer Science, IEEE Computer Society Press, Los Alamitos, CA, 1990, pp. 382--392; full version, J. Comput. System Sci., 48 (1994), pp. 464--497] and [D. Kimber and P. M. Long, Proc. 5th Annual Workshop on Computational Learning Theory, ACM, New York, 1992, pp. 153--160]).