On Learning Parameters of Incremental Learning in Chaotic Neural Network

Incremental learning is a method for composing an associative memory using a chaotic neural network; it provides larger capacity than correlative learning at the cost of a large amount of computation. A chaotic neuron performs a spatio-temporal sum of its inputs, and the temporal sum makes the learning robust to input noise. When the input contains no noise, the neuron may not need the temporal sum. In this paper, to reduce computation, a simplified network without the temporal sum is introduced and investigated through computer simulations, in comparison with the conventional network. Then, to shorten the learning steps, the learning parameters are varied during learning according to three functions.
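The contrast between the full spatio-temporal sum and the simplified variant can be sketched as follows. This is a minimal illustration, not the paper's implementation: the kernel shapes, decay factors (`k_f`, `k_r`), refractory strength (`alpha`), and the steep-sigmoid output are all assumptions in the spirit of standard chaotic-neuron models, with the temporal sum written as an exponentially decayed sum over past outputs.

```python
import numpy as np

def chaotic_neuron_step(x_hist, w, k_f=0.5, k_r=0.8, alpha=1.0, a=0.0):
    """One update of a layer of chaotic neurons with spatio-temporal summation.

    x_hist : (T, N) array of past outputs, x_hist[0] being the most recent.
    w      : (N, N) weight matrix (spatial sum).
    k_f, k_r : assumed temporal-decay factors for the feedback and
               refractory terms; alpha : assumed refractory strength.
    """
    T, _ = x_hist.shape
    decay_f = k_f ** np.arange(T)        # temporal-sum kernel, feedback term
    decay_r = k_r ** np.arange(T)        # temporal-sum kernel, refractory term
    feedback = w @ (decay_f @ x_hist)    # spatial sum of temporally summed inputs
    refractory = alpha * (decay_r @ x_hist)
    u = feedback - refractory + a
    return np.tanh(u / 0.02)             # steep sigmoid-like output function

def simplified_step(x, w, alpha=1.0, a=0.0):
    """Simplified neuron without the temporal sum: only the latest output x
    (shape (N,)) enters the spatial and refractory terms."""
    u = w @ x - alpha * x + a
    return np.tanh(u / 0.02)
```

With a history of length one the two updates coincide, which is why dropping the temporal sum is plausible for noise-free input: the extra terms only matter when past outputs must average out perturbations.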
