Second Order Properties of Error Surfaces: Learning Time and Generalization
The learning time of a simple neural network model is obtained through an analytic computation of the eigenvalue spectrum for the Hessian matrix, which describes the second order properties of the cost function in the space of coupling coefficients. The form of the eigenvalue distribution suggests new techniques for accelerating the learning process, and provides a theoretical justification for the choice of centered versus biased state variables.
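The link between the Hessian's eigenvalue spread and learning time can be illustrated with a small numerical sketch (this is a hypothetical toy example, not the paper's own computation). For a single linear neuron with squared error, the Hessian of the cost is the input correlation matrix, and gradient descent converges at a rate governed by the ratio of its largest to smallest eigenvalue; shifting the inputs away from zero mean (biased state variables) inflates that ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 10

# Toy data (assumed for illustration): centered inputs vs. the same
# inputs with a constant bias added to every component.
X_centered = rng.normal(size=(n, d))   # mean approximately 0
X_biased = X_centered + 5.0            # mean shifted to approximately 5

def hessian_eigs(X):
    # For a linear neuron with quadratic cost, the Hessian is the
    # input correlation matrix X^T X / n (eigvalsh returns ascending order).
    H = X.T @ X / len(X)
    return np.linalg.eigvalsh(H)

eigs_c = hessian_eigs(X_centered)
eigs_b = hessian_eigs(X_biased)

# The eigenvalue ratio lambda_max / lambda_min bounds the number of
# gradient-descent steps needed; a larger ratio means slower learning.
cond_c = eigs_c[-1] / eigs_c[0]
cond_b = eigs_b[-1] / eigs_b[0]
print(f"eigenvalue ratio, centered inputs: {cond_c:.1f}")
print(f"eigenvalue ratio, biased inputs:   {cond_b:.1f}")
```

The biased inputs acquire one large eigenvalue along the mean direction while the smallest eigenvalue stays near its original value, so the spread, and hence the learning time, grows, which is the intuition behind preferring centered state variables.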