On Efficiently Monitoring the Learning Process of Feedforward Neural Networks

We propose a characteristic structure number interrelating the weight vectors of a feedforward neural network. It allows monitoring of the learning process of feedforward neural networks and the identification of characteristic points/phases during learning. Some properties are given, and results of applications to different networks are shown.
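The abstract does not define the structure number itself, but the monitoring idea can be sketched: record, after each training epoch, a single scalar computed from the hidden-layer weight vectors, and inspect that trace for characteristic points or phases. In the sketch below, the mean pairwise cosine similarity between hidden-unit weight vectors is a hypothetical stand-in for the paper's quantity, and the 2-4-1 network on XOR is an illustrative choice, not taken from the paper.

```python
import numpy as np

def structure_number(W):
    """Hypothetical scalar summary of W (one hidden-unit weight vector per row):
    mean pairwise cosine similarity. Stand-in for the paper's structure number."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    C = Wn @ Wn.T                        # all pairwise cosines
    iu = np.triu_indices(len(W), k=1)    # upper triangle, excluding diagonal
    return float(np.mean(C[iu]))

def train_and_monitor(epochs=200, lr=0.5, seed=0):
    """Train a small 2-4-1 sigmoid network on XOR by batch gradient descent,
    recording the structure number of the hidden layer after every epoch."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([[0], [1], [1], [0]], float)
    W1 = rng.normal(0, 0.5, (4, 2)); b1 = np.zeros(4)
    W2 = rng.normal(0, 0.5, (1, 4)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    trace = []
    for _ in range(epochs):
        h = sig(X @ W1.T + b1)           # hidden activations
        out = sig(h @ W2.T + b2)         # network output
        d2 = (out - y) * out * (1 - out)  # output-layer delta
        d1 = (d2 @ W2) * h * (1 - h)      # hidden-layer delta
        W2 -= lr * d2.T @ h / len(X); b2 -= lr * d2.mean(0)
        W1 -= lr * d1.T @ X / len(X); b1 -= lr * d1.mean(0)
        trace.append(structure_number(W1))  # monitor after each epoch
    return trace

trace = train_and_monitor()
```

Plotting `trace` over epochs would then reveal plateaus or abrupt changes, the kind of characteristic points/phases the abstract refers to.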
