On Capacity with Incremental Learning by Simplified Chaotic Neural Network

Chaotic behaviors are often observed in biological brains and are strongly related to memory storage and learning in chaotic neural networks. Incremental learning is a method for composing an associative memory using a chaotic neural network; it provides larger capacity than the Hebbian rule at the cost of additional computation. In previous work, patterns were generated randomly with +1 in half of the elements and −1 in the others. When finely tuned parameters were used, the network learned these patterns well, but this result could be attributed to over-learning. We therefore proposed pattern-generation methods to avoid over-learning and tested patterns in which the ratio of +1 to −1 elements differs from 1:1. In this paper, our simulations investigate the capacity of the usual chaotic neural network and that of the simplified chaotic neural network with these patterns, ensuring that no over-learning occurs.
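The pattern-generation scheme described above can be sketched as follows. This is a minimal illustration, not the paper's actual code: the function name, the NumPy usage, and the `plus_ratio` parameter are assumptions introduced here to show how balanced (1:1) and biased ±1 patterns could be produced.

```python
import numpy as np

def generate_pattern(n, plus_ratio=0.5, rng=None):
    """Generate a random ±1 pattern of length n with a fixed fraction of +1 elements.

    plus_ratio=0.5 reproduces the balanced patterns of the earlier experiments
    (half +1, half -1); other values give patterns whose +1:-1 ratio differs
    from 1:1, as tested to rule out over-learning.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_plus = int(round(n * plus_ratio))
    pattern = np.full(n, -1, dtype=int)          # start with all elements -1
    idx = rng.choice(n, size=n_plus, replace=False)  # pick positions for +1
    pattern[idx] = 1
    return pattern

# Balanced pattern: half +1, half -1
p = generate_pattern(100, plus_ratio=0.5)
# Biased pattern: 30% of elements +1, 70% -1
q = generate_pattern(100, plus_ratio=0.3)
```

Fixing the exact count of +1 elements (rather than sampling each element independently) keeps the stated ratio exact in every generated pattern, which matters when the capacity is compared across different ratios.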
