Phase transitions in soft-committee machines
Equilibrium statistical physics is applied to the off-line training of layered neural networks with differentiable activation functions. A first analysis of soft-committee machines with an arbitrary number (K) of hidden units and continuous weights learning a perfectly matching rule is performed. Our results are exact in the limit of high training temperatures (β → 0). For K = 2 we find a second-order phase transition from unspecialized to specialized student configurations at a critical size P of the training set, whereas for K ≥ 3 the transition is first order. The limit K → ∞ can be performed analytically; the transition occurs after presenting on the order of NK/β examples. However, an unspecialized metastable state persists up to P ∼ NK²/β.
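The architecture described above can be sketched in a few lines. This is a minimal illustration, not the paper's code: tanh stands in for the unspecified differentiable activation, and the 1/√K output normalization is an assumed convention; the teacher-student pairing illustrates the "perfectly matching rule" (student and teacher share the same K).

```python
import numpy as np

def soft_committee(x, W):
    """Output of a soft-committee machine with K hidden units.

    x : input vector of dimension N
    W : hidden-layer weights, shape (K, N); output weights are fixed at 1.

    tanh is a stand-in for the paper's generic differentiable activation,
    and the 1/sqrt(K) normalization is an illustrative convention.
    """
    K = W.shape[0]
    return np.tanh(W @ x).sum() / np.sqrt(K)

# "Perfectly matching rule": the student has the same architecture
# (same K, same input dimension N) as the teacher generating the labels.
rng = np.random.default_rng(0)
N, K = 10, 2
teacher = rng.standard_normal((K, N))
student = rng.standard_normal((K, N))
x = rng.standard_normal(N)
error = 0.5 * (soft_committee(x, student) - soft_committee(x, teacher)) ** 2
```

Note that the output is invariant under permutations of the hidden units; the specialized/unspecialized configurations discussed in the abstract refer to whether this permutation symmetry among the student's hidden units is broken.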