An excellent weight-updating-linearity EEPROM synapse memory cell for self-learning Neuron-MOS neural networks

A new synapse memory cell employing floating-gate EEPROM technology has been developed which is characterized by excellent weight-updating linearity under constant-pulse programming. Such a feature has been realized for the first time by employing a simple self-feedback scheme in each cell's circuitry. The floating-gate potential is transferred to the tunneling electrode by the source-follower action of the built-in cell circuitry, thus ensuring a constant electric field strength in the tunnel oxide at each programming cycle, independent of the charge stored on the floating gate. The synapse cell is composed of only seven transistors and inherits all the advanced features of the original six-transistor cell, such as standby-power-free operation and dual-polarity weight characteristics. In addition, by optimizing the intra-cell coupling capacitance ratios, an acceleration effect in updating the weight has also been accomplished. All these features make the new synapse cell fully compatible with the hardware learning architecture of the Neuron-MOS neural network. The new synapse cell concept has been verified by experiments using test circuits fabricated by a double-polysilicon CMOS process.
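The linearity argument above can be illustrated with a minimal numerical sketch. Fowler-Nordheim tunneling current depends exponentially on the oxide field, so if accumulated floating-gate charge is allowed to oppose the programming voltage, each successive pulse injects less charge and the weight update saturates; if the field is held constant every cycle (as the self-feedback source follower arranges), each pulse injects the same charge and the stored weight grows linearly with pulse count. All device parameters below are illustrative assumptions, not values from the paper.

```python
import math

# Illustrative (assumed) device parameters -- not taken from the paper.
A_FN = 1e-6      # Fowler-Nordheim pre-factor, A/V^2
B_FN = 2.5e8     # Fowler-Nordheim exponential factor, V/cm
T_OX = 1e-6      # tunnel-oxide thickness, cm (10 nm)
C_FG = 1e-14     # floating-gate capacitance, F
AREA = 1e-10     # tunnel-oxide area, cm^2
T_PULSE = 1e-3   # programming pulse width, s
V_PROG = 12.0    # programming voltage, V


def fn_current_density(e_field):
    """Fowler-Nordheim current density J = A*E^2*exp(-B/E) for field E (V/cm)."""
    return A_FN * e_field ** 2 * math.exp(-B_FN / e_field)


def program(n_pulses, self_feedback):
    """Return the floating-gate charge trace over n constant-amplitude pulses.

    With self_feedback=True the oxide field is pinned to the same value at
    every cycle (the role of the cell's source follower); with False, the
    stored charge subtracts from the applied voltage and the field decays.
    """
    q_fg = 0.0
    trace = []
    for _ in range(n_pulses):
        if self_feedback:
            e_field = V_PROG / T_OX                    # constant field each pulse
        else:
            e_field = (V_PROG - q_fg / C_FG) / T_OX    # field reduced by stored charge
        q_fg += fn_current_density(e_field) * AREA * T_PULSE
        trace.append(q_fg)
    return trace


linear = program(10, self_feedback=True)       # equal charge increment per pulse
saturating = program(10, self_feedback=False)  # shrinking increment per pulse
```

With the constant-field branch, every increment `linear[k+1] - linear[k]` is identical, which is exactly the weight-updating linearity the cell is designed for; the open-loop branch shows the usual saturating behavior of conventional floating-gate analog memories.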
