Impact of Weight Initialization on Training of Sigmoidal FFANN

Weight initialization is one of the most important factors affecting the training speed of a neural network. In this paper we use random and Nguyen-Widrow weight initialization, along with the proposed weight initialization method, for training sigmoidal feedforward artificial neural networks (FFANNs). Five data sets of various types, taken from the UCI Machine Learning Repository, are used as input. We use the RPROP back-propagation algorithm for training and testing, and we experiment with different numbers of inputs and hidden-layer nodes with a single output node. We find that in almost all cases the proposed weight initialization method gives better results.
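As a point of comparison for the proposed method, the Nguyen-Widrow scheme mentioned above can be sketched as follows: weights for each hidden neuron are drawn uniformly and then rescaled so the neuron's weight vector has norm beta = 0.7 * h^(1/n), with biases drawn uniformly from [-beta, beta]. This is a minimal illustrative sketch, not the authors' implementation; the function name and NumPy-based signature are our own.

```python
import numpy as np

def nguyen_widrow_init(n_in, n_hidden, seed=None):
    """Sketch of Nguyen-Widrow initialization for one hidden layer.

    n_in:     number of input nodes feeding the layer
    n_hidden: number of hidden (sigmoidal) neurons
    Returns (weights, biases) with shapes (n_hidden, n_in) and (n_hidden,).
    """
    rng = np.random.default_rng(seed)
    # Scale factor from the Nguyen-Widrow derivation.
    beta = 0.7 * n_hidden ** (1.0 / n_in)
    # Draw initial weights uniformly, then rescale each neuron's
    # weight vector to have Euclidean norm exactly beta.
    w = rng.uniform(-0.5, 0.5, size=(n_hidden, n_in))
    w = beta * w / np.linalg.norm(w, axis=1, keepdims=True)
    # Biases spread the active regions of the sigmoids across the input range.
    b = rng.uniform(-beta, beta, size=n_hidden)
    return w, b
```

The rescaling is the key idea: it places each sigmoid's active (non-saturated) region over a distinct portion of the input space, which is what speeds up early training relative to purely random initialization.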

[1]  Simon Haykin, et al.  Neural Networks and Learning Machines, 2010.

[2]  Vladimir Cherkassky, et al.  Comparison of adaptive methods for function estimation from samples, 1996, IEEE Trans. Neural Networks.

[3]  Simon Haykin, et al.  Neural Networks: A Comprehensive Foundation, 1998.

[4]  Bernard Widrow, et al.  Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights, 1990, 1990 IJCNN International Joint Conference on Neural Networks.

[5]  Klaus-Robert Müller, et al.  Efficient BackProp, 2012, Neural Networks: Tricks of the Trade.

[6]  Jong Beom Ra, et al.  Weight value initialization for improving training speed in the backpropagation network, 1991, [Proceedings] 1991 IEEE International Joint Conference on Neural Networks.

[7]  Emile Fiesler, et al.  Neural Network Initialization, 1995, IWANN.

[8]  Vladimir Cherkassky, et al.  Regularization effect of weight initialization in back propagation networks, 1998, 1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227).

[9]  Sandro Ridella, et al.  Statistically controlled activation weight initialization (SCAWI), 1992, IEEE Trans. Neural Networks.

[10]  Halbert White, et al.  Approximating and learning unknown mappings using multilayer feedforward networks with bounded weights, 1990, 1990 IJCNN International Joint Conference on Neural Networks.

[11]  Pravin Chandra, et al.  Interval based Weight Initialization Method for Sigmoidal Feedforward Artificial Neural Networks, 2014.

[12]  Pravin Chandra, et al.  Comparison of sigmoidal FFANN training algorithms for function approximation problems, 2015, 2015 2nd International Conference on Computing for Sustainable Global Development (INDIACom).

[13]  Martin A. Riedmiller, et al.  A direct adaptive method for faster backpropagation learning: the RPROP algorithm, 1993, IEEE International Conference on Neural Networks.