Learning from stochastic rules under finite temperature: optimal temperature and asymptotic learning curve

Department of Physics, Nara Women's University, Nara 630, Japan

In learning under external disturbance, it is expected that some tolerance in the system will optimize the learning process. In this paper, we give one example of this phenomenon in learning from stochastic rules by the Gibbs algorithm. Using the replica method, we show that in the case of output noise there exists an optimal temperature at which the generalization error is minimized. This optimal temperature persists even in the limit of large training sets and is determined by the stable replica symmetric solution. For other types of noise, by contrast, no such optimal temperature exists, and the asymptotic behaviour is governed by the one-step replica symmetry breaking solution. Furthermore, asymptotic expressions for the learning curves are derived; they coincide exactly with those for the minimum-error algorithm.
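The setting described above can be illustrated numerically. The sketch below is not the paper's analytical replica calculation; it is a minimal Monte Carlo simulation, under assumed parameters, of Gibbs learning for a spherical perceptron whose teacher outputs are flipped with some probability (output noise). The function name `gibbs_learning` and all parameter values are illustrative choices, not from the source. A student weight vector is sampled at inverse temperature `beta` from the Gibbs distribution over training energies (the number of misclassified examples) via a Metropolis walk, and the generalization error is computed from the student-teacher overlap.

```python
import numpy as np

def gibbs_learning(n=20, p_examples=200, noise=0.2, beta=2.0,
                   steps=3000, seed=0):
    """Toy Metropolis sampler for Gibbs learning of a noisy perceptron rule.

    Returns the generalization error eps_g = arccos(R) / pi, where R is the
    overlap between the sampled student and the teacher.
    """
    rng = np.random.default_rng(seed)

    # Teacher rule on the unit sphere.
    teacher = rng.standard_normal(n)
    teacher /= np.linalg.norm(teacher)

    # Training set with output noise: each label flipped with prob. `noise`.
    X = rng.standard_normal((p_examples, n))
    y = np.sign(X @ teacher)
    flips = rng.random(p_examples) < noise
    y[flips] *= -1

    def energy(w):
        # Training energy = number of misclassified examples.
        return int(np.sum(np.sign(X @ w) != y))

    # Metropolis walk sampling from exp(-beta * E) on the unit sphere.
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)
    e = energy(w)
    for _ in range(steps):
        w_new = w + 0.1 * rng.standard_normal(n)
        w_new /= np.linalg.norm(w_new)
        e_new = energy(w_new)
        if rng.random() < np.exp(-beta * (e_new - e)):
            w, e = w_new, e_new

    overlap = float(np.clip(w @ teacher, -1.0, 1.0))
    return np.arccos(overlap) / np.pi
```

Sweeping `beta` with such a simulation (averaged over many seeds) is one way to observe the non-monotonic dependence of the generalization error on temperature that the abstract refers to, although the clean optimal-temperature result is an analytical statement in the thermodynamic limit.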