Neural networks: extraordinary variation

Implementations of neural networks show extraordinary variation, ranging from digital and analog electronic circuits to novel devices and optical systems. To provide a broad perspective on these developments, we have selected four projects to summarize, all originally presented at Microneuro 1994.

The first two projects are optical and optoelectronic implementations. While the majority of neural nets are built in silicon, an active research community works on optical networks. The high connectivity of neural nets makes communication among the processing elements one of the main difficulties in implementation; with optical beams, a large number of elements can be addressed in parallel, making optics an attractive alternative to silicon. D.C. Burns and coauthors describe a combination of optics and electronics: the authors have built an optical input plane for a neural net, so that whole images with tens of thousands of pixels can be entered into the network in parallel. S.R. Skinner, J.E. Steck, and E.C. Behrman present an all-optical network, in which optically nonlinear materials perform not only the communication but also the computation.

Learning remains a tricky problem for hardware implementations, in particular for analog circuits. Popular learning algorithms, such as backpropagation, require high computational resolution, and how to implement learning in low-resolution electronics is the subject of intense research. G. Cairns and L. Tarassenko address this issue by comparing the precision required by different learning schemes; the first sketch below illustrates why weight resolution matters for backpropagation.

The final project is a digital circuit implementing a self-organizing feature map, an unsupervised learning technique. Here, too, computational resolution is a major concern: before building the chip, the researchers experimented extensively to determine the minimum resolution that still gives good results. The second sketch below shows one way to run such an experiment in software.
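To make the resolution issue concrete, here is a minimal sketch in Python, assuming a tiny 2-2-1 sigmoid network trained on XOR with weights snapped to a signed fixed-point grid after every update. The network size, learning rate, weight range, and quantization model are illustrative assumptions, not the schemes Cairns and Tarassenko compared; the point is only that below some bit width the grid spacing exceeds the gradient steps and learning stalls.

# Hypothetical sketch: effect of weight-storage resolution on backpropagation.
# All sizes and parameters are illustrative, not taken from the paper.
import numpy as np

def quantize(w, bits, w_max=4.0):
    """Round w to a signed fixed-point grid with the given number of bits."""
    if bits is None:               # None means full floating-point precision
        return w
    levels = 2 ** (bits - 1)       # one bit reserved for the sign
    step = w_max / levels
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def train_xor(bits, epochs=5000, lr=0.5, seed=0):
    """Train a 2-2-1 sigmoid net on XOR, quantizing weights after each update."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)
    W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                        # forward pass
        out = sig(h @ W2 + b2)
        d_out = (out - y) * out * (1 - out)         # output-layer delta
        d_h = (d_out @ W2.T) * h * (1 - h)          # hidden-layer delta
        W2 = quantize(W2 - lr * h.T @ d_out, bits)  # update, then snap to grid
        b2 = quantize(b2 - lr * d_out.sum(0), bits)
        W1 = quantize(W1 - lr * X.T @ d_h, bits)
        b1 = quantize(b1 - lr * d_h.sum(0), bits)
    h = sig(X @ W1 + b1)
    out = sig(h @ W2 + b2)
    return float(np.mean((out - y) ** 2))

for bits in (None, 16, 8, 4):
    print(f"bits={bits}: final MSE = {train_xor(bits):.4f}")

With full precision the error drops toward zero, while at four bits the grid spacing (0.5 here) typically exceeds the size of an update, so most weight changes round back to the old value and the network freezes.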
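For the self-organizing feature map, the kind of pre-silicon experiment described can be sketched as follows. This is a hypothetical setup, assuming an 8x8 Kohonen map trained on uniform two-dimensional inputs, with weights stored at a given fixed-point resolution; the grid size, decay schedules, and error measure are all assumptions chosen only to show how one might search for the minimum usable bit width.

# Hypothetical sketch: a Kohonen self-organizing feature map whose weights are
# stored at fixed-point resolution, to probe how few bits still give a usable map.
import numpy as np

def quantize(w, bits):
    """Snap weights in [0, 1] to a uniform grid with 2**bits levels."""
    levels = 2 ** bits - 1
    return np.round(w * levels) / levels

def train_som(bits, grid=8, steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    W = quantize(rng.random((grid, grid, 2)), bits)     # 8x8 map, 2-D inputs
    ii, jj = np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij")
    for t in range(steps):
        x = rng.random(2)                               # uniform 2-D input
        d = np.sum((W - x) ** 2, axis=2)
        c = np.unravel_index(np.argmin(d), d.shape)     # best-matching unit
        frac = t / steps
        lr = 0.5 * (1 - frac)                           # decaying learning rate
        sigma = max(grid / 2 * (1 - frac), 0.5)         # shrinking neighborhood
        h = np.exp(-((ii - c[0])**2 + (jj - c[1])**2) / (2 * sigma**2))
        W = quantize(W + lr * h[:, :, None] * (x - W), bits)
    # Error measure: mean distance from random inputs to their best-matching unit.
    xs = rng.random((1000, 2))
    return np.mean([np.min(np.sum((W - x)**2, axis=2)) ** 0.5 for x in xs])

for bits in (16, 8, 6, 4):
    print(f"{bits}-bit weights: mean quantization error = {train_som(bits):.3f}")

Sweeping the bit width downward until the error measure degrades gives an estimate of the minimum resolution a digital implementation must provide, which is the kind of evidence the designers gathered before committing the map to a chip.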