Constructing Higher Order Neurons of Increasing Complexity in Cascade Networks

A problem faced by many constructive neural networks using a cascade architecture is their large network depth, which results in large fan-in and long propagation delays; these problems are especially relevant for VLSI implementations of such networks. This work explores the effect of limiting the depth of the cascades created by CasPer, a constructive cascade algorithm. Instead of a single cascade of hidden neurons, a series of cascade towers is built, each of which can be viewed as a single Higher Order Neuron (HON). The optimal HON complexity for a given problem is difficult to estimate, and choosing it is a form of the bias-variance dilemma. This problem is overcome by constructing HONs of increasing complexity. It is shown that constructing HONs in this manner reduces the chance of overfitting, especially on noisy data.
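The structural trade-off the abstract describes can be illustrated with a small sketch. The function names and growth schedule below are hypothetical illustrations, not the authors' implementation: each tower is treated as a HON with a single output, a neuron inside a tower sees the network inputs, the outputs of earlier towers, and the earlier neurons of its own tower, and each new tower is allowed one more hidden neuron than the last (one possible "increasing complexity" schedule).

```python
def build_towers(n_towers):
    """Return the number of cascaded hidden neurons in each tower.

    Hypothetical schedule: tower k contains k+1 neurons, so HON
    complexity grows gradually as towers are added.
    """
    return [k + 1 for k in range(n_towers)]


def max_fan_in(n_inputs, towers):
    """Largest fan-in of any hidden neuron under the tower scheme.

    Neuron i of tower t connects to the n_inputs network inputs,
    the single outputs of the t earlier towers, and the i earlier
    neurons of its own tower.
    """
    fan = 0
    for t, depth in enumerate(towers):
        for i in range(depth):
            fan = max(fan, n_inputs + t + i)
    return fan
```

For comparison, a single cascade with the same total number of hidden neurons N has a last-neuron fan-in of `n_inputs + N - 1` and a propagation path of length N, whereas the tower scheme caps the path length at the deepest tower's depth. For example, with 4 inputs and towers of sizes [1, 2, 3] (6 hidden neurons total), the worst-case fan-in is 8 and the longest path is 3, versus a fan-in of 9 and a path of 6 for one monolithic cascade.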
