The effects of analog hardware properties on backpropagation networks with on-chip learning

Results are presented from simulations in which both the forward and backward computations of a backpropagation network are performed on-chip with analog components. The aspects of analog hardware studied are component variability (variability in multiplier gains and zero offsets), limited voltage ranges, and components (multipliers) that only approximate the computations required by the backpropagation algorithm. It is shown that backpropagation networks can learn to compensate for all of these shortcomings of analog circuits except zero offsets: variability in multiplier gains is not a problem, and learning remains possible despite limited voltage ranges and approximate multiplications. Fixed component variation arising from fabrication is shown to be less detrimental to learning than component variation due to noise.
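
To make the kinds of nonidealities described above concrete, the following is a minimal sketch, not taken from the paper, of how such a simulation might be set up. The network size, noise magnitudes, task (XOR), and all names are illustrative assumptions; fixed per-multiplier gains and offsets are drawn once to stand in for fabrication variation, and output clipping stands in for a limited voltage range.

import numpy as np

rng = np.random.default_rng(0)

# XOR task: a standard small benchmark for learning simulations (assumption).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 4, 1
W1 = rng.normal(0, 0.5, (n_in, n_hid))
W2 = rng.normal(0, 0.5, (n_hid, n_out))

# Fabrication-style variation: each multiplier gets its own gain and zero
# offset, drawn once and held fixed during training (magnitudes are assumed).
GAIN_SPREAD = 0.10
OFFSET_SPREAD = 0.02
g1 = rng.normal(1.0, GAIN_SPREAD, W1.shape)
g2 = rng.normal(1.0, GAIN_SPREAD, W2.shape)
o1 = rng.normal(0.0, OFFSET_SPREAD, W1.shape)
o2 = rng.normal(0.0, OFFSET_SPREAD, W2.shape)

def nonideal_matmul(a, W, gain, offset):
    # Each product a_i * w_ij goes through an imperfect analog multiplier:
    # gain_ij * (a_i * w_ij) + offset_ij, and the results are summed.
    return (a[:, :, None] * W[None, :, :] * gain[None, :, :]
            + offset[None, :, :]).sum(axis=1)

def sigmoid(x):
    # Clipping the argument stands in for a limited voltage range.
    return 1.0 / (1.0 + np.exp(-np.clip(x, -5, 5)))

eta = 0.5
for epoch in range(5000):
    # Forward pass through the nonideal multipliers.
    h = sigmoid(nonideal_matmul(X, W1, g1, o1))
    y = sigmoid(nonideal_matmul(h, W2, g2, o2))

    # Backward pass (kept ideal here for brevity; the paper also puts the
    # backward computation on-chip).
    d_out = (y - T) * y * (1 - y)
    d_hid = (d_out @ W2.T) * h * (1 - h)
    W2 -= eta * h.T @ d_out
    W1 -= eta * X.T @ d_hid

print("outputs after training:", y.ravel().round(2))

Resampling g1, g2, o1, o2 inside the training loop, rather than drawing them once, would model noise-driven variation instead of fixed fabrication variation, allowing the two cases compared in the abstract to be contrasted directly.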
