Cascade Correlation: An Incremental Tool for Function Approximation

In this paper we show that the Cascade Correlation (CC) algorithm can find, through an incremental procedure, a sum of weighted hyperbolic tangents that approximates any function of practical interest to any desired degree of accuracy. At each step the incremental algorithm trains only a one-layer perceptron, and it therefore offers a way of solving the credit assignment problem. We show that the integrated squared error converges at a rate of order O(1/n_h), where n_h is the number of hidden neurons; numerical results obtained through computer simulation agree with the theory. Our analysis shows that CC is an efficient implementation of the Projection Pursuit algorithm.
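To make the incremental scheme concrete, the following is a minimal sketch (not the paper's implementation) of greedy one-unit-at-a-time approximation with tanh units: each new unit is trained alone against the current residual and then frozen, so every step is a one-layer problem. For simplicity it fits each unit to the residual by plain gradient descent rather than maximizing the correlation score used in full Cascade Correlation; the function names, learning rate, and toy target are illustrative assumptions.

```python
# Sketch of CC-style incremental approximation: each new tanh unit is
# fit to the current residual, then frozen, so every step trains only
# a one-layer problem. (Residual fitting stands in for CC's
# correlation maximization; this is not the authors' code.)
import numpy as np

rng = np.random.default_rng(0)

def fit_tanh_unit(x, residual, lr=0.05, epochs=2000):
    """Fit a single unit c*tanh(w*x + b) to the residual by gradient descent."""
    w, b, c = rng.normal(size=3)
    for _ in range(epochs):
        z = np.tanh(w * x + b)
        err = c * z - residual
        dz = c * (1.0 - z**2)            # derivative through tanh
        w -= lr * np.mean(err * dz * x)  # gradient w.r.t. w
        b -= lr * np.mean(err * dz)      # gradient w.r.t. b
        c -= lr * np.mean(err * z)       # gradient w.r.t. c
    return w, b, c

def incremental_fit(x, y, n_hidden=8):
    """Add tanh units one at a time; earlier units stay frozen."""
    units, approx = [], np.zeros_like(y)
    for _ in range(n_hidden):
        w, b, c = fit_tanh_unit(x, y - approx)   # train on residual only
        units.append((w, b, c))
        approx += c * np.tanh(w * x + b)
        print(f"{len(units):2d} units, mean squared error ~ "
              f"{np.mean((y - approx)**2):.5f}")
    return units

# Toy target; under the theory the error should shrink roughly like O(1/n_h).
x = np.linspace(-1.0, 1.0, 200)
y = np.sin(3.0 * x) + 0.3 * x**2
incremental_fit(x, y)
```

Printing the error after each added unit gives a crude empirical check of the O(1/n_h) rate: on a smooth target the squared error should fall roughly in inverse proportion to the number of installed units.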
