Performance Optimization of Adaptive Resonance Neural Networks Using Genetic Algorithms

We present a hybrid clustering system based on the adaptive resonance theory 1 (ART1) artificial neural network (ANN) combined with a genetic algorithm (GA) optimizer that tunes the ART1 ANN settings. As a case study, we consider text clustering. The focus of our experiments is clustering quality: the multi-dimensional space of ART1 design parameters contains many combinations of values that yield high clustering quality, but these parameters are hard to estimate manually. We therefore propose a GA to find some of these parameter sets. Results show better clustering and a simpler quality estimator when compared with existing techniques. We call this algorithm genetically engineered parameters ART1, or ARTgep.
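The sketch below is a minimal illustration (not the authors' implementation) of the idea described above: a simplified fast-learning ART1 clusters binary vectors, and a small genetic algorithm searches the (vigilance, beta) parameter space for settings that maximize a clustering-quality score. The purity measure, GA operators, and parameter ranges used here are assumptions for illustration; the paper uses its own quality estimator on text data.

```python
# Minimal sketch of GA-tuned ART1 (assumed design, not the ARTgep code).
# ART1 parameters searched: vigilance rho and the choice parameter beta.
import random

def art1_cluster(data, rho, beta):
    """Simplified fast-learning ART1 over binary vectors (lists of 0/1)."""
    prototypes = []            # one binary template per committed category
    assignments = []
    for x in data:
        norm_x = sum(x)
        # Rank committed categories by the choice function |x AND w| / (beta + |w|).
        scores = []
        for j, w in enumerate(prototypes):
            overlap = sum(xi & wi for xi, wi in zip(x, w))
            scores.append((overlap / (beta + sum(w)), overlap, j))
        scores.sort(reverse=True)
        chosen = None
        for _, overlap, j in scores:
            if norm_x > 0 and overlap / norm_x >= rho:      # vigilance test
                chosen = j
                prototypes[j] = [xi & wi for xi, wi in zip(x, prototypes[j])]
                break
        if chosen is None:                                  # commit a new category
            prototypes.append(list(x))
            chosen = len(prototypes) - 1
        assignments.append(chosen)
    return assignments

def purity(assignments, labels):
    """Stand-in quality score: fraction of items matching their cluster's majority label."""
    clusters = {}
    for c, y in zip(assignments, labels):
        clusters.setdefault(c, []).append(y)
    correct = sum(max(members.count(y) for y in set(members))
                  for members in clusters.values())
    return correct / len(labels)

def ga_tune(data, labels, pop_size=20, generations=30):
    """Tiny real-coded GA over (rho, beta): truncation selection, blend crossover, Gaussian mutation."""
    def fitness(ind):
        rho, beta = ind
        return purity(art1_cluster(data, rho, beta), labels)
    pop = [(random.uniform(0.1, 0.95), random.uniform(0.1, 2.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [(a[0] + b[0]) / 2, (a[1] + b[1]) / 2]  # blend crossover
            if random.random() < 0.3:                       # mutation
                child[0] = min(0.99, max(0.05, child[0] + random.gauss(0, 0.05)))
                child[1] = max(0.01, child[1] + random.gauss(0, 0.1))
            children.append(tuple(child))
        pop = parents + children
    best = max(pop, key=fitness)
    return best, fitness(best)
```

In practice the fitness function would be the paper's clustering-quality estimator computed on term-presence vectors of documents; purity over known labels is used here only to keep the sketch self-contained and runnable.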
