A recursive least squares algorithm for evolution and learning by an optimal interpolative net

An evolutionary learning algorithm is presented for the optimal interpolative net proposed by R.J.P. de Figueiredo (1990). The algorithm is based on a recursive least squares training procedure. Sigmoidal activation functions more general than the pure exponential one considered previously are discussed. A key attribute of the present approach is that it incorporates into the structure of the net the smallest number of prototypes from the training set T necessary to classify all members of T correctly. Thus, the net grows only to the degree of complexity it needs in order to solve a given classification problem. It is shown how this approach avoids some of the difficulties posed by the backpropagation algorithm, which stem from the latter's inflexible network architecture. The results are demonstrated by experiments with the Iris data set.
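To make the training principle concrete, the following is a minimal sketch of a generic recursive least squares (RLS) weight update of the kind such a procedure builds on. It is not the paper's exact OI-net algorithm; the class name, parameters, and the choice of a plain linear model are illustrative assumptions.

```python
import numpy as np

class RecursiveLeastSquares:
    """Generic RLS estimator for y = w.x (illustrative, not the OI-net procedure)."""

    def __init__(self, n_features, delta=1e3, lam=1.0):
        self.w = np.zeros(n_features)        # current weight estimate
        self.P = np.eye(n_features) * delta  # inverse correlation matrix estimate
        self.lam = lam                       # forgetting factor (1.0 = no forgetting)

    def update(self, x, y):
        """Incorporate one training pair (x, y) without refitting from scratch."""
        Px = self.P @ x
        k = Px / (self.lam + x @ Px)         # gain vector
        e = y - self.w @ x                   # a priori prediction error
        self.w = self.w + k * e              # correct weights in the gain direction
        self.P = (self.P - np.outer(k, Px)) / self.lam
        return e
```

The appeal of a recursive formulation for an incrementally grown net is visible here: each new sample (or, in the OI-net setting, each newly admitted prototype) updates the solution in O(n²) time rather than requiring a full batch re-inversion.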