Towards minimal network architectures with evolutionary growth networks

This paper points out how simple learning rules, such as the perceptron and delta rules, can be re-introduced as local learning techniques to yield an effective automatic network construction algorithm. This is accomplished by choosing an appropriate training set at each stage of network construction. The choice of partitions can have profound effects on the quality of the created networks, in terms of the number of hidden units and connections. Selection of partitions during the various network construction phases is achieved by means of evolutionary processes. Empirical evidence underscoring the effectiveness of this approach is provided for several well-known benchmark problems such as the parity, encoder and adder functions.
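As a rough illustration of the general idea described above (not the authors' algorithm), the following sketch grows a network one hidden unit at a time: each new unit is trained with the classic perceptron rule on a chosen subset of the training set, and units are added until the remaining examples are covered. The partition is chosen at random here purely for illustration; in the paper the partitions are selected by an evolutionary process. All function names and parameters below are assumptions introduced for this sketch.

```python
# Minimal sketch of constructive growth with a local perceptron rule.
# Assumption: random partition choice stands in for evolutionary selection.
import numpy as np

def train_perceptron(X, y, epochs=100, lr=0.1):
    """Train a single threshold unit with the classic perceptron rule."""
    w = np.zeros(X.shape[1] + 1)                  # weights plus bias
    Xb = np.hstack([X, np.ones((len(X), 1))])
    for _ in range(epochs):
        for xi, ti in zip(Xb, y):
            out = 1 if xi @ w > 0 else 0
            w += lr * (ti - out) * xi             # local, error-driven update
    return w

def grow_network(X, y, max_units=8, rng=np.random.default_rng(0)):
    """Add hidden units until all training examples are covered or a limit is hit."""
    units, uncovered = [], np.arange(len(X))
    Xb = np.hstack([X, np.ones((len(X), 1))])
    while len(uncovered) and len(units) < max_units:
        # Pick a training-set partition for the new unit (random stand-in
        # for the evolutionary selection described in the paper).
        subset = rng.choice(uncovered, size=max(1, len(uncovered) // 2), replace=False)
        w = train_perceptron(X[subset], y[subset])
        units.append(w)
        correct = ((Xb @ w > 0).astype(int) == y)
        uncovered = uncovered[~correct[uncovered]]  # keep still-misclassified points
    return units

# Toy usage on 3-bit parity, one of the benchmark problems mentioned above.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)])
y = X.sum(axis=1) % 2
print(f"grew {len(grow_network(X, y))} hidden units")
```

Under this reading, the quality of the resulting architecture (how few units suffice) hinges on which subset each new unit is trained on, which is why the partition-selection step is the natural place for an evolutionary search.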