Local Algorithms for Pattern Recognition and Dependencies Estimation

In previous publications (Bottou and Vapnik 1992; Vapnik 1992) we described local learning algorithms, which yield performance improvements on real-world problems. Here we present the theoretical framework on which these algorithms are based. First, we give a new statement of certain learning problems, namely local risk minimization. We review the basic results of the uniform convergence theory of learning and extend them to local risk minimization. We also extend the structural risk minimization principle to both pattern recognition and regression problems. This extended induction principle is the basis for a new class of algorithms.
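To make the notion concrete, the local risk functional is commonly written by weighting the usual loss with a locality kernel K centred at the point of interest x_0. The sketch below follows the general local-learning formulation; the particular kernel, normalization, and notation are assumptions and may differ in detail from those used in the paper.

\[
R_{x_0}(\alpha, \beta) \;=\;
\frac{\displaystyle\int L\big(y, f(x, \alpha)\big)\,
      K\!\left(\frac{x - x_0}{\beta}\right)\, dP(x, y)}
     {\displaystyle\int K\!\left(\frac{x - x_0}{\beta}\right)\, dP(x)}
\]

Here L is the loss, f(x, alpha) the candidate function, and beta the locality parameter. As beta grows large the kernel becomes flat and the ordinary global expected risk \(\int L(y, f(x, \alpha))\, dP(x, y)\) is recovered, while a small beta lets only training data near x_0 influence the estimate.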

[1] Vladimir Vapnik et al. Principles of Risk Minimization for Learning Theory. NIPS, 1991.

[2] Léon Bottou and Vladimir Vapnik. Local Learning Algorithms. Neural Computation, 1992.