A symbolic solution for adaptive feedforward neural networks found with a new training algorithm

Traditional training algorithms for adaptive feedforward neural networks (NNs) find numerical values for the weights and thresholds. In this paper it is shown that a NN composed of linear threshold gates (LTGs) can function as a fully trained neural network without numerical values ever being found for the weights and thresholds. This surprising result is demonstrated by presenting a new training algorithm for this type of NN that resolves the network into constraints describing all the numerical values the NN's weights and thresholds can take. The constraints do not require a numerical solution for the network to function as a fully trained NN that can generalize. The solution is said to be symbolic because no numerical solution is required.
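To make the idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of how a single LTG can be "trained" by accumulating constraints on its weight vector w and threshold T rather than solving for numbers: each training pair (x, y) contributes the constraint w·x ≥ T when y = 1 and w·x < T when y = 0. The function names and the AND example are illustrative assumptions.

```python
# Hypothetical sketch: symbolic training of one linear threshold gate (LTG).
# Each sample (x, y) becomes a constraint on (w, T):
#   y = 1  ->  w . x >= T
#   y = 0  ->  w . x <  T

def learn_constraints(samples):
    """Return the constraint set describing every valid (w, T)."""
    return [(x, y) for x, y in samples]

def satisfies(w, T, constraints):
    """Check whether a numeric (w, T) lies in the constrained region."""
    for x, y in constraints:
        s = sum(wi * xi for wi, xi in zip(w, x))
        if y == 1 and not s >= T:
            return False
        if y == 0 and not s < T:
            return False
    return True

# "Train" an LTG on logical AND; the constraint set itself is the solution.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
constraints = learn_constraints(AND)

# Any numeric assignment inside the region works, e.g. w = (1, 1), T = 2,
# so a particular numeric solution is optional, not required.
assert satisfies((1, 1), 2.0, constraints)
assert not satisfies((1, 1), 0.5, constraints)  # violates a y = 0 case
```

The point of the sketch is that the constraint set, not any one numeric assignment, characterizes the trained gate; picking concrete weights is a separate, optional step.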