A novel learning framework of CMAC via grey-area-time credit apportionment and grey learning rate

The CMAC neural network offers fast learning convergence, rapid mapping of nonlinear functions thanks to the local generalization of its weight updates, a simple architecture, and straightforward processing and hardware implementation. During training, however, a large fixed learning rate can make some CMAC models unstable, while a small one slows convergence. To address this, we propose a grey learning rate for the training phase: grey relational analysis is combined with the number of training iterations to obtain an adequate learning rate and better convergence performance. In addition, learning interference is a serious problem that reduces both learning speed and accuracy. Our idea is that the error correction for each addressed hypercube should be proportional to the inverse of its learning times, its trained input area, and its grey relational grade; a credit apportionment scheme built on this idea yields fast and accurate learning. This paper therefore proposes a novel CMAC learning framework for improved performance and real-time applications. Simulation results show that the proposed algorithm converges more quickly and accurately in the early training cycles, and the advantage remains significant in the subsequent cycles.
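To make the credit-apportionment idea concrete, below is a minimal sketch of a one-dimensional CMAC whose update shares the output error among the addressed hypercubes according to a credit term. The class name `CreditCMAC`, the tiling scheme, and all parameter values are illustrative assumptions; because the abstract does not give the exact grey-area-time formula, the credit here uses only the inverse of each hypercube's learning count (as in credit-assigned CMAC), and the fixed `lr` stands in for the proposed grey learning rate.

```python
# Minimal sketch of a 1-D CMAC with credit-apportioned updates.
# Assumptions (not taken from the paper): the credit is modeled only as the
# inverse of each hypercube's learning count; the trained-input-area and
# grey-relational-grade factors of the proposed scheme are not reproduced here.
import numpy as np

class CreditCMAC:
    def __init__(self, n_cells=100, n_layers=8, x_min=0.0, x_max=1.0):
        self.n_layers = n_layers          # generalization size (active cells per input)
        self.weights = np.zeros(n_cells)  # weight memory
        self.counts = np.zeros(n_cells)   # learning times per hypercube
        self.n_cells = n_cells
        self.x_min, self.x_max = x_min, x_max

    def _addresses(self, x):
        # Map input x to n_layers overlapping quantized cells (simple tiling).
        q = (x - self.x_min) / (self.x_max - self.x_min) * (self.n_cells - self.n_layers)
        base = int(q)
        return [(base + k) % self.n_cells for k in range(self.n_layers)]

    def predict(self, x):
        return self.weights[self._addresses(x)].sum()

    def train(self, x, target, lr=0.5):
        idx = self._addresses(x)
        error = target - self.predict(x)
        # Credit apportionment: cells trained fewer times receive a larger
        # share of the correction (inverse of learning times, normalized).
        credit = 1.0 / (self.counts[idx] + 1.0)
        credit /= credit.sum()
        self.weights[idx] += lr * error * credit
        self.counts[idx] += 1.0

# Usage: approximate sin(2*pi*x) on [0, 1].
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    net = CreditCMAC()
    for _ in range(2000):
        x = rng.random()
        net.train(x, np.sin(2 * np.pi * x))
    xs = np.linspace(0.0, 1.0, 11)
    err = np.abs([net.predict(x) - np.sin(2 * np.pi * x) for x in xs])
    print("mean |error| on grid:", err.mean())
```

In a full implementation of the proposed framework, the credit would also fold in the trained input area and the grey relational grade of each addressed hypercube, and `lr` would be adapted from grey relational analysis over the training iterations rather than kept fixed.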
