Examples of learning curves from a modified VC-formalism

We examine the evaluation of model-specific parameters in a modified VC-formalism. Two examples are analyzed: the 2-dimensional homogeneous perceptron and the 1-dimensional higher-order neuron. Both models are solved theoretically, and the resulting learning curves are compared against the true learning curves. It is shown that the formalism can generate a variety of learning curves, including ones displaying "phase transitions."
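The modified VC-formalism itself is not reproduced in this abstract. As a point of reference only, a classical VC-type bound already yields a characteristic decaying learning curve; the sketch below plots such a bound for a class of VC dimension d. The functional form is the standard one (of order sqrt(d ln m / m)); the confidence level and constants are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch (NOT the paper's modified formalism): a classical
# VC-style learning-curve bound of order sqrt((d ln m + ln(1/delta)) / m).
# The constants and delta=0.05 below are assumptions for illustration.
import math

def vc_bound(m, d, delta=0.05):
    """Classical-form upper bound on generalization error after m i.i.d.
    examples, for a hypothesis class of VC dimension d."""
    return math.sqrt((d * (math.log(2 * m / d) + 1) + math.log(4 / delta)) / m)

# Tabulate the learning curve for a low-dimensional class (d = 2),
# comparable in scale to the 2-dimensional perceptron example.
for m in (10, 100, 1000, 10000):
    print(m, round(vc_bound(m, d=2), 3))
```

Such worst-case bounds decay smoothly in m; one point of the paper is that the modified formalism can additionally produce non-smooth behavior ("phase transitions") absent from this classical picture.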
