A radial basis function neurocomputer implemented with analog VLSI circuits

An electronic neurocomputer that implements a radial basis function neural network (RBFNN) is described. An RBFNN uses a radial basis function as the transfer function of its hidden units. The key advantages of RBFNNs over other neural network architectures are reduced learning time and ease of VLSI implementation. The neurocomputer is based on an analog/digital hybrid design and has been constructed with custom analog VLSI circuits and a commercially available digital signal processor. The hybrid architecture was chosen because it offers high computational performance, compensates for analog inaccuracies, and can model large problems.
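
For reference, a minimal sketch of the computation an RBFNN performs, assuming Gaussian basis functions and illustrative (hypothetical) centers, widths, and weights not taken from the paper:

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of a radial basis function network.

    x        : (d,) input vector
    centers  : (m, d) basis-function centers
    widths   : (m,) per-unit Gaussian widths (sigma)
    weights  : (m, k) linear output weights
    """
    # Distance of the input from every center (the "radial" part).
    dists = np.linalg.norm(centers - x, axis=1)
    # Gaussian basis activations, one per hidden unit.
    phi = np.exp(-(dists ** 2) / (2.0 * widths ** 2))
    # The output layer is a plain linear combination of the activations.
    return phi @ weights

# Example: 2-D input, 4 hidden units, 1 output (all values illustrative).
rng = np.random.default_rng(0)
y = rbf_forward(rng.normal(size=2),
                rng.normal(size=(4, 2)),
                np.ones(4),
                rng.normal(size=(4, 1)))
```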
