A training algorithm for Adaptive Feedforward Neural Networks that determines its own topology

There has been some interest in developing neural network training algorithms that determine their own architecture. A training algorithm is presented here for adaptive feedforward neural networks composed of Linear Threshold Gates (LTGs) that determines its own architecture and trains in a single pass. The algorithm produces what can be called a symbolic solution, as it resolves the relationships between the weights and the thresholds into constraints that do not need to be solved numerically. The resulting network has been shown to behave as a fully trained neural network that generalizes, and the possibility that the algorithm has polynomial time complexity is discussed. The algorithm uses binary data during training and avoids the local minima problem that backpropagation experiences.
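To make the notions of an LTG and a symbolic (constraint-based) solution concrete, the following is a minimal illustrative sketch, assuming the standard dot-product threshold model; the function names and the constraint string format are hypothetical and are not taken from the paper itself.

```python
# Hypothetical sketch of a Linear Threshold Gate (LTG).
# An LTG outputs 1 when the weighted sum of its binary inputs
# meets or exceeds its threshold T, and 0 otherwise.

def ltg_output(weights, threshold, inputs):
    """Fire (return 1) iff w . x >= T."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

def constraint_for(inputs, target):
    """Record the symbolic constraint one binary training pair
    imposes on the weights and threshold, instead of solving
    for numeric values: w . x >= T for target 1, w . x < T
    for target 0."""
    terms = " + ".join(f"{x}*w{i}" for i, x in enumerate(inputs))
    op = ">=" if target == 1 else "<"
    return f"{terms} {op} T"

# Example: an LTG computing logical AND with w = (1, 1), T = 2.
print(ltg_output([1, 1], 2, [1, 1]))  # 1
print(ltg_output([1, 1], 2, [1, 0]))  # 0
print(constraint_for([1, 1], 1))      # 1*w0 + 1*w1 >= T
```

In this constraint view, each training example narrows the feasible region of weight-threshold combinations rather than producing a single numeric weight vector.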