Descending epsilon in back-propagation: a technique for better generalization
There are two measures of the optimality of a trained feedforward network on a given set of training patterns: the global error function and the correctness ratio. In the present work, the authors argue that these two measures are not parallel and present a technique, called descending epsilon, with which back-propagation achieves a high correctness ratio. It is shown that, with this technique, the trained networks often exhibit high correctness ratios not only on the training patterns but also on novel patterns.
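The sketch below illustrates one common reading of the descending-epsilon idea, assuming that output errors already smaller than the current tolerance epsilon contribute no gradient, and that epsilon is lowered on a fixed schedule as training proceeds. The network size, learning rate, schedule, and the XOR task are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch of back-propagation with a descending error tolerance
# (epsilon).  All hyperparameters and the task are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-4-1 feedforward network trained on XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
epsilon_schedule = [0.4, 0.3, 0.2, 0.1, 0.05]  # hypothetical descending tolerances

for epsilon in epsilon_schedule:
    for _ in range(2000):
        # Forward pass.
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)

        # Outputs already within the current tolerance are treated as
        # correct and produce no gradient -- the "descending epsilon" step.
        err = Y - T
        err[np.abs(err) < epsilon] = 0.0

        # Standard back-propagation on the remaining errors.
        dY = err * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * (H.T @ dY); b2 -= lr * dY.sum(axis=0)
        W1 -= lr * (X.T @ dH); b1 -= lr * dH.sum(axis=0)

# Correctness ratio: fraction of patterns whose thresholded output matches the target.
pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
print("correctness ratio:", (pred == T).mean())
```

In this sketch the tolerance plays the role of the correctness criterion: early in training the network only needs to get each output roughly right, and the requirement is tightened step by step, which is one way the technique can favor a high correctness ratio over simply minimizing the global error.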