Capacity of Memory and Error Correction Capability in Chaotic Neural Networks with Incremental Learning

Neural networks can store more patterns with incremental learning than with correlative learning. Incremental learning is a method for composing an associative memory using a chaotic neural network. In previous work, it was found that the capacity of the network increases with its size up to some threshold value and decreases beyond that size, and that both the threshold value and the capacity varied with two different learning parameters. In this paper, the capacity of the network is investigated while varying the learning parameter. Computer simulations show that the capacity also increases in proportion to the network size and that the capacity of a network with incremental learning is more than 11 times larger than that of one with correlative learning. The error correction capability is also estimated for a 100-neuron network.
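
As a rough illustration of the setting described above, the sketch below (not taken from the paper) builds an associative memory from Aihara-style chaotic neurons using the baseline correlative (outer-product) rule and probes error correction on a 100-neuron network. The parameter values, pattern count, and recall procedure are assumptions for illustration only; the paper's incremental learning rule and its learning parameter are not reproduced here.

```python
import numpy as np

# Minimal sketch, assuming a Hopfield-style correlative rule and Aihara-type
# chaotic neuron dynamics; this is NOT the paper's incremental learning method.

rng = np.random.default_rng(0)

N = 100                        # network size (the paper's error-correction test uses 100 neurons)
P = 10                         # number of stored patterns (assumed)
patterns = rng.choice([-1, 1], size=(P, N))

# Correlative (outer-product) learning: w_ij = (1/N) * sum_p xi_i^p * xi_j^p
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

# Chaotic neuron parameters (typical illustrative values, not the paper's).
k_f, k_r = 0.2, 0.9            # decay of feedback and refractory internal states
alpha = 1.0                    # refractory scaling
a = 0.0                        # bias
eps = 0.015                    # steepness of the sigmoid-like output function

def recall(cue, steps=200):
    """Run the chaotic network from a (possibly corrupted) cue pattern."""
    eta = np.zeros(N)          # feedback internal state
    zeta = np.zeros(N)         # refractory internal state
    x = cue.astype(float)
    for _ in range(steps):
        eta = k_f * eta + W @ x          # recurrent feedback term
        zeta = k_r * zeta - alpha * x + a  # refractory term
        x = np.tanh((eta + zeta) / eps)    # smooth output in [-1, 1]
    return np.sign(x)

# Error-correction check: flip a few bits of a stored pattern and recall it.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1
overlap = recall(cue) @ patterns[0] / N
print(f"overlap with stored pattern after recall: {overlap:.2f}")
```

The overlap printed at the end is only a crude indicator of retrieval quality under these assumed dynamics; the paper instead measures capacity and error correction while varying the learning (refractory) parameter of the incremental rule.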
