Analysis of the Initial Values in Split-Complex Backpropagation Algorithm

When a multilayer perceptron (MLP) is trained with the split-complex backpropagation (SCBP) algorithm, its performance depends relatively strongly on the initial values of the weights and biases. For effective adjustment of the weights and biases in SCBP, we propose that the range of the initial values should be greater than that of the adjustment quantities. This criterion reduces the misadjustment of the weights and biases, and from it a suitable range for the initial values can be estimated. The results show that this range depends on the properties of the communication channel in use and on the structure of the MLP (the number of layers and the number of nodes in each layer). The criterion is evaluated in channel equalization scenarios, where simulations show that initializing within the estimated range yields significantly improved performance.
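As an illustration of the criterion above, the following minimal Python sketch initializes a split-complex layer (real and imaginary weight parts drawn independently) with a range chosen larger than a rough bound on the per-step adjustment magnitude. The helper names, the gradient bound, and the safety factor are assumptions for illustration; the paper's estimated range additionally depends on the channel and the network structure.

```python
import random

def init_split_complex_layer(fan_in, fan_out, w_range):
    """Draw real and imaginary weight parts independently and uniformly
    from [-w_range, w_range], as in split-complex (not fully complex) MLPs."""
    real = [[random.uniform(-w_range, w_range) for _ in range(fan_in)]
            for _ in range(fan_out)]
    imag = [[random.uniform(-w_range, w_range) for _ in range(fan_in)]
            for _ in range(fan_out)]
    return real, imag

def adjustment_bound(learning_rate, grad_bound):
    """Rough bound on a single SCBP weight update: |dw| <= eta * |gradient|.
    grad_bound is a hypothetical estimate, not derived in this sketch."""
    return learning_rate * grad_bound

# Criterion: pick the initial-value range larger than the adjustment range.
eta, grad_bound = 0.01, 1.0
delta = adjustment_bound(eta, grad_bound)
w_range = 4 * delta          # illustrative safety factor, not from the paper
real_w, imag_w = init_split_complex_layer(fan_in=5, fan_out=3,
                                          w_range=w_range)
assert all(abs(w) <= w_range for row in real_w + imag_w for w in row)
```

The key point is only the inequality `w_range > delta`; how much larger the initial range should be is what the paper's estimate determines from the channel and the MLP architecture.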
