High order neural networks with reduced numbers of interconnection weights

A multilayered network whose first layer consists of parabolic neurons (constrained second-order neurons) is proposed. Each parabolic neuron requires only N+2 interconnections (where N is the number of inputs), virtually the same as the N+1 of a linear neuron. Such a network needs fewer neurons or layers than a standard backpropagation (BP) network to solve the same problem, and it converges much faster. Training is achieved through the backpropagation learning law. It is concluded that incorporating parabolic neurons into a BP network can significantly improve the network's performance without an explosive growth in the number of interconnection weights.
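
As an illustration, one plausible form for such a parabolic neuron, under the assumption that the second-order term is constrained to a single shared quadratic coefficient w_q (the abstract does not state the exact formulation), is:

% Sketch of a constrained second-order (parabolic) neuron; the shared
% coefficient w_q is an assumption, not taken from the abstract.
y = f\left( \sum_{i=1}^{N} w_i x_i + w_q \sum_{i=1}^{N} x_i^2 + \theta \right)

Under this assumption the neuron has N linear weights w_i, one quadratic coefficient w_q, and one bias \theta, giving N+2 adjustable interconnections in total, in contrast to the O(N^2) weights of an unconstrained second-order neuron with a full quadratic term over all input pairs.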