Growing fuzzy topology adaptive resonance theory models with a push-pull learning algorithm

A new incrementally growing neural network model, called the growing fuzzy topology ART (GFTART) model, is proposed. It integrates the conventional fuzzy ART model with the incremental topology-preserving mechanism of the growing cell structure (GCS) model, together with a new training algorithm called the push-pull learning algorithm. The proposed GFTART model has two purposes: first, to reduce the proliferation of incrementally generated nodes in the F2 layer of the conventional fuzzy ART model by replacing each F2 node with a GCS; second, to enhance the class-dependent clustering representation ability of the GCS model by incorporating the categorization property of the conventional fuzzy ART model. In addition, the proposed push-pull training algorithm enhances the cluster-discriminating property and partially alleviates the forgetting problem of the GCS training algorithm.

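The sketch below illustrates, under stated assumptions, the two ingredients mentioned above: a minimal fuzzy ART categorizer (complement coding, choice function, vigilance test, fast learning) and one plausible form of a push-pull update that pulls the correct-class winner toward the input while pushing the nearest wrong-class node away. The names `FuzzyARTSketch`, `push_pull_update`, `pull_rate`, and `push_rate` are illustrative, not from the paper, and the replacement of each F2 node by a GCS sub-network is simplified to a single prototype per category.

```python
import numpy as np

def complement_code(x):
    """Standard fuzzy ART complement coding: [x, 1 - x] for inputs in [0, 1]."""
    x = np.asarray(x, dtype=float)
    return np.concatenate([x, 1.0 - x])

class FuzzyARTSketch:
    """Minimal fuzzy ART categorizer (choice, vigilance, fast learning).

    This is a compact sketch of the conventional fuzzy ART dynamics,
    not the GFTART implementation from the paper.
    """

    def __init__(self, alpha=0.001, beta=1.0, rho=0.75):
        self.alpha = alpha      # choice parameter
        self.beta = beta        # learning rate (1.0 = fast learning)
        self.rho = rho          # vigilance
        self.weights = []       # one weight vector per committed F2 category

    def _choice(self, x, w):
        # Choice function T_j = |x ^ w_j| / (alpha + |w_j|), fuzzy AND = min.
        return np.minimum(x, w).sum() / (self.alpha + w.sum())

    def _match(self, x, w):
        # Match function |x ^ w_j| / |x| compared against the vigilance rho.
        return np.minimum(x, w).sum() / x.sum()

    def train(self, x):
        x = complement_code(x)
        # Search existing categories in order of decreasing choice value.
        order = sorted(range(len(self.weights)),
                       key=lambda j: self._choice(x, self.weights[j]),
                       reverse=True)
        for j in order:
            if self._match(x, self.weights[j]) >= self.rho:   # resonance
                w = self.weights[j]
                self.weights[j] = self.beta * np.minimum(x, w) + (1 - self.beta) * w
                return j
        # No category passes the vigilance test: commit a new F2 node.
        self.weights.append(x.copy())
        return len(self.weights) - 1

def push_pull_update(nodes, x, winner_idx, rival_idx,
                     pull_rate=0.1, push_rate=0.02):
    """Illustrative push-pull step (assumed form, not the paper's exact rule):
    pull the correct-class winner toward the input and push the closest
    wrong-class (rival) node away, sharpening class boundaries."""
    nodes[winner_idx] += pull_rate * (x - nodes[winner_idx])
    if rival_idx is not None:
        nodes[rival_idx] -= push_rate * (x - nodes[rival_idx])
    return nodes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    art = FuzzyARTSketch(rho=0.8)
    data = rng.random((20, 2))              # toy 2-D inputs in [0, 1]
    labels = [art.train(x) for x in data]
    print("committed F2 categories:", len(art.weights))
```

In the full GFTART model each committed F2 category would hold its own small GCS sub-network rather than a single prototype; the sketch keeps one weight vector per category only to stay compact.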