A Comparative Study of Two Self-Organising and Structurally Adaptive Dynamic Neural Tree Networks

This paper examines the performance of Dynamic Neural Tree Networks (DNTNs), which perform hierarchical clustering on unlabelled data. DNTNs are a form of competitive learning neural network in which the competitive neurons are created dynamically, forming a tree configuration that represents the structure inherent in the data set. Two such models have been produced independently, by Racz and Klotz and by Li, Tang and Suen. The models are critically evaluated both theoretically and empirically, and their performance is compared to one another and to standard competitive networks. DNTNs provide a hierarchically structured clustering technique that converges quickly and can handle large data sets. Tests run on the implemented models demonstrate the potential of such networks.
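The mechanism described above — competitive neurons created on demand and arranged in a tree — can be illustrated with a minimal sketch. The class names, the growth rule (spawn a new neuron when the winner's quantisation error exceeds a fixed threshold), and all parameter values below are illustrative assumptions for exposition, not taken from either of the DNTN models compared in the paper:

```python
import numpy as np


class Node:
    """A competitive neuron; the leaves act as the current cluster prototypes."""

    def __init__(self, weight):
        self.weight = np.array(weight, dtype=float)  # copy to avoid aliasing
        self.children = []


class DynamicNeuralTree:
    """Minimal dynamic neural tree sketch (illustrative, not either paper's model).

    Each input descends the tree, choosing the nearest child at every level.
    The winning leaf is moved toward the input; when the winner's quantisation
    error exceeds `spawn_threshold`, a new neuron is created instead — as a
    sibling of the winner if its parent has room, otherwise as a child.
    """

    def __init__(self, dim, lr=0.2, spawn_threshold=2.0, max_children=4):
        self.root = Node(np.zeros(dim))
        self.lr = lr
        self.spawn_threshold = spawn_threshold
        self.max_children = max_children

    def train_step(self, x):
        x = np.asarray(x, dtype=float)
        parent, node = None, self.root
        while node.children:  # descend to the nearest leaf, level by level
            parent = node
            node = min(node.children,
                       key=lambda c: np.linalg.norm(x - c.weight))
        err = np.linalg.norm(x - node.weight)
        if err <= self.spawn_threshold:
            # Standard competitive update: winner moves toward the input.
            node.weight += self.lr * (x - node.weight)
        elif parent is not None and len(parent.children) < self.max_children:
            parent.children.append(Node(x))  # horizontal growth: new sibling
        else:
            node.children.append(Node(x))    # vertical growth: new child

    def leaves(self):
        """Collect the leaf neurons, i.e. the current cluster prototypes."""
        out, stack = [], [self.root]
        while stack:
            n = stack.pop()
            if n.children:
                stack.extend(n.children)
            else:
                out.append(n)
        return out


# Demo: two well-separated clusters; the tree grows one prototype per cluster.
tree = DynamicNeuralTree(dim=2)
data = [[0.0, 0.0], [0.2, 0.0], [5.0, 5.0], [5.0, 5.2]]
for _ in range(3):
    for x in data:
        tree.train_step(x)
```

Because neurons are created only when no existing prototype fits the input, the tree's size tracks the structure of the data rather than being fixed in advance — the property that distinguishes DNTNs from standard competitive networks with a preset number of neurons.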

[1] Stanley C. Ahalt et al., Competitive Learning Algorithms for Vector Quantization, Neural Networks, 1990.

[2] Teuvo Kohonen, Self-Organization and Associative Memory, 1988.

[3] Janos Racz et al., Knowledge Representation by Dynamic Competitive Learning Techniques, Defense + Commercial Sensing, 1991.

[4] Y. Y. Tang et al., A Structurally Adaptive Neural Tree for the Recognition of Large Character Set, Proceedings of the 11th IAPR International Conference on Pattern Recognition, Vol. II, 1992.

[5] Peter H. A. Sneath et al., Numerical Taxonomy: The Principles and Practice of Numerical Classification, 1973.

[6] Duane DeSieno, Adding a Conscience to Competitive Learning, IEEE International Conference on Neural Networks, 1988.

[7] John A. Hartigan, Clustering Algorithms, 1975.

[8] Stephen Grossberg et al., A Massively Parallel Architecture for a Self-Organizing Neural Pattern Recognition Machine, Computer Vision, Graphics, and Image Processing, 1988.