Effect of number of hidden neurons on learning in large-scale layered neural networks

To provide a guideline on the number of hidden neurons N<sup>(h)</sup> and the learning rate η for large-scale neural networks from the viewpoint of stable learning, the authors roughly formulate the boundary of stable learning and fit it to the actual learning results of random-number mapping problems. The simulations confirm that the hidden-output connection weights become small as the number of hidden neurons grows, and that a trade-off in learning stability exists between the input-hidden and hidden-output connections. Finally, two equations, N<sup>(h)</sup> = √(N<sup>(i)</sup> N<sup>(o)</sup>) and η = 32 / √(N<sup>(i)</sup> N<sup>(o)</sup>), where N<sup>(i)</sup> and N<sup>(o)</sup> are the numbers of input and output neurons respectively, are roughly derived, although further adjustment is necessary for other problems or conditions.
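For concreteness, a minimal Python sketch of the two guideline equations follows; the function name, the rounding of N<sup>(h)</sup> to an integer, and the example network sizes are illustrative assumptions, not part of the paper.

```python
import math

def suggest_hyperparameters(n_input: int, n_output: int) -> tuple[int, float]:
    """Compute the paper's rough guidelines:
    N^(h) = sqrt(N^(i) * N^(o)) and eta = 32 / sqrt(N^(i) * N^(o)).
    Function name and rounding are illustrative choices."""
    geo_mean = math.sqrt(n_input * n_output)  # geometric mean of layer sizes
    n_hidden = round(geo_mean)                # suggested number of hidden neurons
    learning_rate = 32.0 / geo_mean           # suggested learning rate eta
    return n_hidden, learning_rate

# Hypothetical example: a 1000-input, 100-output network
print(suggest_hyperparameters(1000, 100))  # -> (316, 0.1011...)
```

Note that both guidelines scale with the geometric mean of the input and output layer sizes, so a larger network gets more hidden neurons but a proportionally smaller learning rate, reflecting the stability trade-off observed in the simulations.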