On Influence of Refractory Parameter in Incremental Learning

Neural networks can learn more patterns with incremental learning than with correlative learning. Incremental learning is a method for composing an associative memory using a chaotic neural network. The capacity of such a network has been found to grow with its size, that is, the number of neurons in the network, and to exceed the capacity obtained with correlative learning. The appropriate learning parameter is inversely proportional to the network size. In former work, however, the refractory parameter was fixed to a single value, which gives the network the ability to reinforce memories. In this paper, the capacity of the network was investigated while varying both the learning parameter and the refractory parameter. Computer simulations showed that the capacity grows faster than in direct proportion to the network size.
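To make the setting concrete, the following Python sketch shows one plausible form of incremental learning on an Aihara-style chaotic neural network. The split of the internal state into a feedback term eta and a refractory term zeta follows the chaotic neuron model, but the parameter names (k_f, k_r, alpha, delta_w) and the sign-based weight update rule are illustrative assumptions, not the paper's exact formulation.

    # Minimal sketch: incremental learning on a chaotic neural network.
    # The refractory parameter alpha and the learning parameter delta_w
    # are the two quantities varied in the paper; the concrete update
    # rule below is an illustrative assumption.
    import numpy as np

    rng = np.random.default_rng(0)

    n = 100                 # network size (number of neurons)
    k_f, k_r = 0.5, 0.8     # decay of the feedback and refractory states
    alpha = 2.0             # refractory parameter (fixed in former work)
    delta_w = 0.1 / n       # learning parameter, ~ 1/n per the abstract

    w = np.zeros((n, n))    # connection weights
    eta = np.zeros(n)       # feedback internal state
    zeta = np.zeros(n)      # refractory internal state

    def step(pattern):
        """One synchronous update under the external input pattern (+/-1)."""
        global eta, zeta
        x = np.sign(eta + zeta + pattern)
        x[x == 0] = 1.0
        # Incremental learning: where the internal state disagrees with
        # the input, pull that neuron's weights toward the pattern.
        disagree = np.sign(eta + zeta) != pattern
        w[disagree, :] += delta_w * np.outer(pattern, pattern)[disagree, :]
        np.fill_diagonal(w, 0.0)
        eta = k_f * eta + w @ x             # chaotic feedback dynamics
        zeta = k_r * zeta - alpha * x       # refractoriness suppresses firing
        return x

    pattern = rng.choice([-1.0, 1.0], size=n)
    for _ in range(50):                     # present the pattern repeatedly
        out = step(pattern)
    print("recall overlap:", float(out @ pattern) / n)

Under this reading, capacity would be measured by repeating the presentation loop for many patterns and counting how many remain recallable; the paper's claim is that, with the learning parameter and the refractory parameter tuned together, that count grows faster than linearly in the network size n.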
