A learnable cellular neural network structure with ratio memory for image processing

In this paper, a learnable cellular neural network (CNN) with space-variant templates and ratio memory (RM), called the RMCNN, is proposed and analyzed. By incorporating a modified Hebbian learning rule and RM into the CNN architecture, the RMCNN operating as an associative memory can generate absolute weights and then transform them into ratioed A-template weights, which serve as ratio memories for the recognition of noisy input patterns. Simulation results show that, owing to the feature-enhancement effect of RM, an RMCNN subject to constant leakage on its template coefficients can store and recognize more patterns than CNN associative memories without RM that use the same learning rule and the same constant leakage on space-variant template coefficients. A 9×9 (18×18) RMCNN can learn, store, and recognize three (five) patterns. Based on the RMCNN architecture, an experimental 9×9 RMCNN chip was designed and fabricated in a 0.35 μm CMOS technology. The measurement results successfully verify the correct functions of the RMCNN.
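The weight-generation idea described above can be sketched in software. The following is a minimal illustration, not the paper's circuit-level method: it assumes a Hebbian accumulation of each cell's correlation with its 3×3 neighbors over the training patterns, followed by the RM step of normalizing each cell's absolute weights into ratios; wrap-around boundaries via `np.roll` and the function name `learn_rm_templates` are illustrative assumptions.

```python
import numpy as np

def learn_rm_templates(patterns, r=1):
    """Hebbian-style learning of space-variant A-template weights,
    followed by conversion to ratioed weights (ratio memory).
    Sketch only; the paper's modified Hebbian rule, leakage, and
    RM circuit behavior are not reproduced exactly."""
    patterns = [np.asarray(p, dtype=float) for p in patterns]
    rows, cols = patterns[0].shape
    # neighborhood offsets for a (2r+1) x (2r+1) template
    offsets = [(di, dj) for di in range(-r, r + 1) for dj in range(-r, r + 1)]
    abs_w = np.zeros((rows, cols, len(offsets)))
    # Hebbian accumulation: correlate each cell with each neighbor
    # over all training patterns (wrap-around boundary is assumed)
    for p in patterns:
        for k, (di, dj) in enumerate(offsets):
            shifted = np.roll(np.roll(p, di, axis=0), dj, axis=1)
            abs_w[:, :, k] += p * shifted
    # Ratio memory: divide each weight by the sum of the absolute
    # weights at that cell, so large (feature) weights are enhanced
    # relative to small noise-dominated ones
    denom = np.abs(abs_w).sum(axis=2, keepdims=True)
    denom[denom == 0] = 1.0  # guard against empty neighborhoods
    return abs_w / denom

# usage: learn ratioed templates from three 9x9 bipolar patterns
pats = [np.where(np.random.rand(9, 9) > 0.5, 1.0, -1.0) for _ in range(3)]
ratio_w = learn_rm_templates(pats)   # shape (9, 9, 9): one 3x3 template per cell
```

Each cell thus holds its own space-variant template, and the normalization step is what the paper exploits: the ratioed weights keep their relative magnitudes even as absolute stored values leak away.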
