The analysis of the addition of stochasticity to a neural tree classifier

Abstract This paper describes several mechanisms for adding stochasticity to a dynamic hierarchical neural clusterer, a network that grows a tree-structured neural classifier dynamically in response to the unlabelled data with which it is presented. Experiments were undertaken to evaluate the effects of this added stochasticity, using two sets of internal parameters that define the characteristics of the neural clusterer. A Genetic Algorithm (GA), with appropriate cluster criterion measures in its fitness function, was used to search the parameter space for these parameter instantiations. It was found that the addition of non-determinism produced more reliable clustering performance, especially on unseen real-world data. Finally, deliberately changing the tree shape by varying key parameters was investigated, illustrated and systematically analysed.
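The GA-based parameter search described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's actual method: the parameter names, ranges, and fitness function are hypothetical stand-ins (the paper uses cluster criterion measures over the clusterer's output, which would replace the toy fitness here).

```python
import random

# Hypothetical parameter ranges for the neural tree clusterer
# (names and bounds are illustrative, not taken from the paper).
PARAM_RANGES = {"learning_rate": (0.01, 0.5), "split_threshold": (0.1, 1.0)}

def random_individual():
    """Sample one candidate parameter setting uniformly from the ranges."""
    return {k: random.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}

def fitness(ind):
    """Stand-in for a cluster-criterion measure.

    A real fitness function would build the neural tree with these
    parameters, cluster the data, and score the resulting partition.
    Here we use a smooth toy function peaked at each range midpoint.
    """
    score = 0.0
    for k, (lo, hi) in PARAM_RANGES.items():
        mid = (lo + hi) / 2.0
        score -= (ind[k] - mid) ** 2
    return score

def mutate(ind, rate=0.2):
    """Perturb each gene with small Gaussian noise, clipped to its range."""
    child = dict(ind)
    for k, (lo, hi) in PARAM_RANGES.items():
        if random.random() < rate:
            child[k] = min(hi, max(lo, child[k] + random.gauss(0, 0.05 * (hi - lo))))
    return child

def crossover(a, b):
    """Uniform crossover: each gene copied from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in a}

def ga_search(pop_size=20, generations=30, seed=0):
    """Elitist GA: keep the better half, refill with mutated offspring."""
    random.seed(seed)
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]
        pop = elite + [
            mutate(crossover(random.choice(elite), random.choice(elite)))
            for _ in range(pop_size - len(elite))
        ]
    return max(pop, key=fitness)

best = ga_search()
```

Because the clusterer is non-deterministic, a practical version would average the criterion over several runs per candidate before ranking the population.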
