Neural network simulation at Warp speed: how we got 17 million connections per second

A fast back-propagation algorithm for a linear array of processors is described. Results from an implementation of this algorithm on Warp, a ten-processor programmable systolic array computer, are reviewed and compared with back-propagation implementations on other machines. The current Warp simulator is about eight times faster at simulating the NETtalk text-to-speech network than the fastest back-propagation simulator previously reported in the literature, and it is used routinely in a road-recognition experiment for robot navigation. These results indicate that linear systolic array machines can be efficient neural network simulators. Planned extensions and improvements to the current algorithm are also discussed.
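For readers unfamiliar with the kind of partitioning the abstract alludes to, the sketch below illustrates one common way to map a back-propagation step onto a linear array of P processors: each processor owns a vertical slice of the hidden layer (a block of columns of the first weight matrix and the matching rows of the second) and contributes partial sums toward the output. This is a minimal illustrative sketch only; the layer sizes, the column-wise split, and the sequential loop standing in for the array are assumptions, not the actual Warp (W2) implementation described in the paper.

```python
# Illustrative sketch: partitioning one back-propagation step of a
# single-hidden-layer network across a linear array of P "processors".
# Layer sizes and the column-wise split are assumptions for illustration;
# this is NOT the Warp implementation from the paper.
import numpy as np

P = 10                              # processors in the array (Warp has ten cells)
n_in, n_hid, n_out = 203, 60, 26    # NETtalk-like layer sizes (assumed)
rng = np.random.default_rng(0)

# Each processor owns a slice of the hidden units: a block of columns of W1
# and the matching block of rows of W2.
W1 = rng.normal(0, 0.1, (n_in, n_hid))
W2 = rng.normal(0, 0.1, (n_hid, n_out))
hid_slices = np.array_split(np.arange(n_hid), P)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = rng.normal(size=n_in)           # one input pattern
t = np.zeros(n_out); t[3] = 1.0     # one target pattern

# Forward pass: each processor computes its slice of the hidden layer,
# then adds its partial sum into the output activations.
h = np.zeros(n_hid)
net_out = np.zeros(n_out)
for s in hid_slices:                # "per-processor" work, serialized here
    h[s] = sigmoid(x @ W1[:, s])
    net_out += h[s] @ W2[s, :]      # partial sums accumulated along the array
y = sigmoid(net_out)

# Backward pass: the output error is broadcast down the array; each
# processor updates only the weights it owns.
lr = 0.1
delta_out = (y - t) * y * (1 - y)
for s in hid_slices:
    delta_hid = (delta_out @ W2[s, :].T) * h[s] * (1 - h[s])
    W2[s, :] -= lr * np.outer(h[s], delta_out)
    W1[:, s] -= lr * np.outer(x, delta_hid)
```

In this sketch each iteration of the per-processor loops would, on a real array, run concurrently on its own cell, with the partial output sums pipelined from cell to cell; the input pattern and output error are the only quantities that must travel the length of the array.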