Descending epsilon in back-propagation: a technique for better generalization

There are two measures of the optimality of a trained feedforward network on a given set of training patterns: the global error function and the correctness ratio. In the present work, the authors argue that these two measures are not parallel, in that minimizing the global error does not necessarily maximize the correctness ratio, and present a technique, called descending epsilon, with which back-propagation yields a high correctness ratio. It is shown that networks trained with this technique often exhibit high correctness ratios not only on the training patterns but also on novel patterns.
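
The abstract does not spell out the procedure, but the idea behind descending epsilon can be illustrated with a short sketch. In the NumPy example below, the task (XOR), network size, learning rate, and epsilon schedule are all illustrative assumptions rather than the authors' exact method: any output already within epsilon of its target contributes zero error, back-propagation acts only on the outputs that are still outside the tolerance, and epsilon is lowered once every training pattern falls within it.

# A minimal sketch of the descending-epsilon idea (assumed details,
# not the authors' exact procedure): errors smaller than epsilon are
# zeroed so backprop works only on outputs that are still "wrong",
# and epsilon shrinks whenever all patterns are within tolerance.
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-layer network on XOR (illustrative task, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

epsilon, eps_min, lr = 0.45, 0.05, 0.5   # assumed schedule parameters
for step in range(20000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)

    # Descending-epsilon error: outputs within epsilon of the target
    # count as correct, so they contribute no gradient.
    err = T - Y
    err[np.abs(err) < epsilon] = 0.0

    # Once every pattern is within epsilon, tighten the tolerance.
    if not err.any():
        if epsilon <= eps_min:
            break
        epsilon = max(eps_min, epsilon - 0.1)  # assumed linear decrease
        continue

    # Standard backprop (squared error, sigmoid units) with masked error.
    dY = err * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 += lr * H.T @ dY; b2 += lr * dY.sum(axis=0)
    W1 += lr * X.T @ dH; b1 += lr * dH.sum(axis=0)

print("final epsilon:", epsilon)
print("outputs:", Y.ravel().round(3))

Because the error is masked rather than driven all the way to the 0/1 targets, early training only has to place each output on the correct side of the (wide) tolerance band; the band is then narrowed gradually, which is one plausible reading of how the technique trades exact error minimization for a high correctness ratio.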