Solving the N-bit parity problem using neural networks
In this letter, a constructive solution to the N-bit parity problem is provided with a neural network that allows direct connections between the input layer and the output layer. The present approach requires no training or adaptation, and therefore permits the use of a simple threshold activation function for the output and hidden layer neurons. It is first shown that this choice of activation function and network structure leads to several solutions of the 3-bit parity problem obtained by linear programming. One of these solutions is then generalized to a solution of the N-bit parity problem using ⌊N/2⌋ hidden layer neurons. It is further shown that, with a "staircase" type activation function, the ⌊N/2⌋ hidden layer neurons can be combined into a single hidden layer neuron.
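To make the construction concrete, the following Python sketch builds one possible network of this kind: ⌊N/2⌋ hard-threshold hidden units, each firing once the input sum reaches an even level, plus direct input-to-output connections. The specific weight and bias values (and the floor-based form of the "staircase" activation in the single-hidden-unit variant) are illustrative assumptions, not necessarily the exact solution derived in the letter.

```python
from itertools import product

def step(z):
    """Hard threshold activation used for the hidden and output neurons."""
    return 1 if z >= 0 else 0

def parity_threshold_net(x, n):
    """Sketch of a floor(N/2)-hidden-unit construction (illustrative weights).
    Hidden unit k fires once the input sum s reaches 2k, so the hidden layer
    jointly computes floor(s/2); the output neuron also receives the inputs
    directly through the input-to-output connections."""
    s = sum(x)                                   # all input weights are +1
    hidden = [step(s - (2 * k - 0.5)) for k in range(1, n // 2 + 1)]
    # Direct connections (+1 per input), weight -2 per hidden unit, bias -0.5:
    # output = step(s - 2*floor(s/2) - 0.5) = s mod 2.
    return step(s - 2 * sum(hidden) - 0.5)

def parity_staircase_net(x):
    """Same idea with a single hidden neuron whose 'staircase' activation is
    assumed here to be g(z) = floor(z/2); the letter's exact activation may
    differ in form."""
    s = sum(x)
    return step(s - 2 * (s // 2) - 0.5)

# Exhaustive check for small N: both networks reproduce the parity of the input.
for n in range(1, 9):
    for x in product([0, 1], repeat=n):
        p = sum(x) % 2
        assert parity_threshold_net(x, n) == p
        assert parity_staircase_net(x) == p
print("parity reproduced for N = 1..8")
```

The staircase variant works for the same reason the threshold network does: the hidden layer's only role is to supply the quantity ⌊s/2⌋, which is a staircase function of the input sum, so a single unit whose activation realizes that staircase can replace all ⌊N/2⌋ threshold units.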