Injecting Chaos in Feedforward Neural Networks

Chaos appears in many natural and artificial systems; accordingly, we propose a method that injects chaos into a supervised feedforward neural network (NN). The chaos is injected simultaneously into the learnable temperature coefficient of the sigmoid activation function and into the weights of the NN. This is functionally different from noise injection (NI), which is relatively distant from biological realism. We investigate whether chaos injection is more efficient than standard backpropagation, the adaptive neuron model, and NI algorithms by applying these techniques to benchmark classification problems (heart disease, glass, breast cancer, and diabetes identification) and to time-series prediction. In each case, chaos injection is superior to the standard approaches in terms of generalization ability and convergence rate. The performance of the proposed method is also statistically distinguishable from that of noise injection.
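To make the mechanism concrete, below is a minimal NumPy sketch of the general idea: a logistic map generates a chaotic sequence that perturbs both the weights and the learnable sigmoid temperature at every gradient step. This is an illustrative assumption, not the authors' published formulation; the logistic map, the perturbation proportional to weight magnitude, and all hyperparameters (`eps`, the learning rates, the network size, the XOR task) are choices made here for the sketch only.

```python
# Minimal sketch of chaos injection during training, assuming (not taken
# from the paper) a logistic-map chaotic signal added to the weight and
# temperature updates. All hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def logistic_map(x):
    """One step of the logistic map in its chaotic regime (r = 4)."""
    return 4.0 * x * (1.0 - x)

def sigmoid(z, T):
    """Sigmoid activation with temperature coefficient T."""
    return 1.0 / (1.0 + np.exp(-T * z))

# Toy task: XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer; T is learnable and shared across units for simplicity.
W1 = rng.normal(0.0, 0.5, (2, 4))
W2 = rng.normal(0.0, 0.5, (4, 1))
T = 1.0
lr, lr_T, eps = 0.5, 0.01, 0.01   # learning rates and chaos scale (assumed)
c = 0.3                           # logistic-map state, any value in (0, 1) except 0.5

for epoch in range(10000):
    # Forward pass.
    z1 = X @ W1
    h = sigmoid(z1, T)
    z2 = h @ W2
    out = sigmoid(z2, T)

    # Backward pass (squared error; derivative of sigmoid(T*z) is T*s*(1-s)).
    err = out - y
    d2 = err * T * out * (1 - out)          # dE/dz2
    dW2 = h.T @ d2
    dh = d2 @ W2.T
    d1 = dh * T * h * (1 - h)               # dE/dz1
    dW1 = X.T @ d1
    # T enters both layers, so its gradient collects both contributions.
    dT = np.sum(err * out * (1 - out) * z2) + np.sum(dh * h * (1 - h) * z1)

    # Advance the chaotic sequence and inject it into every update.
    c = logistic_map(c)
    chaos = 2.0 * c - 1.0                   # rescale to [-1, 1]
    W1 -= lr * dW1
    W1 += eps * chaos * W1                  # chaotic perturbation of the weights
    W2 -= lr * dW2
    W2 += eps * chaos * W2
    T -= lr_T * dT
    T += eps * chaos                        # chaotic perturbation of the temperature
    T = max(T, 0.05)                        # keep the temperature positive

print(np.round(out.ravel(), 3))             # typically approaches [0, 1, 1, 0]
```

Replacing `chaos` with a draw from `rng.normal()` would recover a plain weight-noise-injection scheme, which is essentially the NI baseline the abstract compares against; the difference here is that the perturbation follows a deterministic chaotic trajectory rather than independent random noise.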
