Enhancement of the weight cell utilization for CMAC neural networks: architecture design and hardware implementation

The CMAC neural network model offers fast learning and insensitivity to the order in which training data are presented. However, it can require a huge address space to realize the weight cell memory. In this paper, we propose a memory banking structure and a direct weight-cell address mapping scheme that sharply reduce the required address space of the weight cell memory. The mapping scheme also generates weight cell addresses quickly. In addition, a pipelined architecture is developed to realize the CMAC chip. To manage design complexity efficiently and to increase design productivity and maintainability, a high-level synthesis technique is adopted for the logic design of the CMAC chip.
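The abstract does not detail the proposed banking and mapping scheme itself, but the storage problem it addresses comes from conventional CMAC addressing, where every input activates a fixed number of overlapping tiles and each tile must be mapped into weight memory. The following is a minimal, hypothetical sketch of that conventional scheme (the function names, the generalization parameter `c`, and the use of a hash to fold tiles into a finite memory are illustrative assumptions, not the paper's design):

```python
# Hypothetical sketch of conventional CMAC weight-cell addressing.
# Each input vector activates c overlapping tilings; every tiling yields
# one weight-cell address. Folding tile coordinates into a finite memory
# (here via hashing) is what makes the naive address space so large or,
# when folded, collision-prone -- the problem the paper's direct mapping
# scheme is designed to reduce.

def cmac_addresses(x, c=4, mem_size=2048):
    """Return the c weight-cell addresses activated by integer input vector x."""
    addrs = []
    for j in range(c):                               # one address per tiling
        tile = tuple((xi + j) // c for xi in x)      # tile coordinates in tiling j
        addrs.append(hash((j,) + tile) % mem_size)   # fold into finite memory
    return addrs

def cmac_output(x, weights, c=4):
    """The CMAC output is the sum of the c addressed weight cells."""
    return sum(weights[a] for a in cmac_addresses(x, c, len(weights)))
```

Because adjacent inputs fall into mostly the same tiles, they share most of their addresses, which is the source of CMAC's local generalization and its fast, order-insensitive learning noted above.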
