A digital 'snake' implementation of the back-propagation neural network

A digital implementation of a multilayer neural network model that uses backpropagation as its learning algorithm is presented. The architecture consists of a set of elementary processors (neurons) arranged in a linear sequence, where each processor communicates only with its two nearest neighbors. A sophisticated control scheme for data exchange among neurons, using two data buses, ensures full pipelining in the forward mode. The proposed architecture is highly flexible: because it has only local connections, it can be expanded simply by adjoining more processors, and it can be programmed in terms of the number and width of its layers.
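The pipelined forward pass over a linear chain of processors can be illustrated with a small simulation. The sketch below is an assumption-laden illustration, not the paper's actual design: each processor holds one neuron's weight vector, input elements are streamed along the chain with a one-step skew per processor, and each processor accumulates its partial dot product as data flows past. The function name `snake_forward`, the time-skewed indexing, and the sigmoid activation are all illustrative choices.

```python
import numpy as np

def snake_forward(weights, x):
    """Simulate one layer of a linear ('snake') processor array.

    Each processor holds one neuron's weight vector. Input elements
    are streamed along the chain one per time step, and each processor
    accumulates its partial dot product; the skewed entry of data
    models the pipeline delay between neighboring processors.
    (Illustrative sketch only -- the data-flow details are assumed,
    not taken from the paper.)
    """
    n_neurons, n_inputs = weights.shape
    acc = np.zeros(n_neurons)
    # Time-stepped simulation: at step t, processor p sees input
    # element x[t - p], i.e. data reaches processor p one step
    # after processor p - 1.
    for t in range(n_inputs + n_neurons - 1):
        for p in range(n_neurons):
            k = t - p
            if 0 <= k < n_inputs:
                acc[p] += weights[p, k] * x[k]
    # Sigmoid activation, as is typical for backpropagation networks.
    return 1.0 / (1.0 + np.exp(-acc))
```

Because each processor only ever exchanges data with its immediate neighbors, a second input vector could enter the chain before the first has finished, which is the pipelining property the abstract describes.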
