ANCHOR - A Connectionist Architecture for Hierarchical Nesting of Multiple Heterogeneous Neural Nets

We present a novel connectionist architecture for handling arbitrarily complex neural computations. The Artificial Neural Network Compiler for Hierarchical ORganisation (ANCHOR) takes a generalised view of the neuron processing element, which we have termed a “superneuron”: a unit that accepts n inputs and outputs m features, where m = 1 for an element of a conventional network. A superneuron is a single- or higher-order multi-layer perceptron that can be trained individually, with the system supporting multiple learning algorithms. Because a superneuron is indistinguishable from a neuron at its interface, superneurons can be nested hierarchically within other superneurons to (theoretically) unbounded depth, as well as cascaded linearly or in tree structures. ANCHOR facilitates partitioning of the input space into a number of smaller, simpler sub-mappings, and it has been tested on learning simple Boolean functions through hierarchical calls and cascading of simpler superneurons. Issues related to partitioning and self-partitioning for a statistical hierarchical classifier are also discussed.
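The nesting idea described above can be sketched in a few lines: a superneuron exposes the same n-in, m-out interface whether it is a leaf perceptron or a composite of cascaded superneurons, so composites can themselves be nested as units inside larger ones. The class name, the linear-cascade composition, and the hand-set threshold weights below are illustrative assumptions for a minimal sketch, not the paper's actual implementation; the XOR example mirrors the Boolean-function experiments only in spirit.

```python
def step(z):
    """Hard-threshold activation for a simple perceptron unit."""
    return 1 if z >= 0 else 0

class Superneuron:
    """A unit mapping n inputs to m output features.

    A leaf is a single-layer threshold perceptron (one weight row and
    bias per output feature). A composite holds child superneurons
    cascaded linearly, the outputs of one feeding the next. Both expose
    the same forward() interface, so composites nest like plain neurons.
    """

    def __init__(self, weights=None, bias=None, children=None):
        self.weights = weights    # leaf: list of weight rows, one per feature
        self.bias = bias          # leaf: one bias per feature
        self.children = children  # composite: cascaded sub-superneurons

    def forward(self, x):
        if self.children:  # hierarchical call into nested superneurons
            for child in self.children:
                x = child.forward(x)
            return x
        # leaf: m threshold units over the n inputs
        return [step(sum(w * xi for w, xi in zip(row, x)) + b)
                for row, b in zip(self.weights, self.bias)]

# XOR built by cascading two simpler superneurons:
# stage 1 emits two features, (OR(x1, x2), NAND(x1, x2));
# stage 2 ANDs them into a single output (m = 1).
stage1 = Superneuron(weights=[[1, 1], [-1, -1]], bias=[-0.5, 1.5])
stage2 = Superneuron(weights=[[1, 1]], bias=[-1.5])
xor = Superneuron(children=[stage1, stage2])
```

Because `xor` satisfies the same interface as a leaf, it could in turn appear as a child inside a yet larger superneuron, which is the indistinguishability property the abstract relies on for unbounded nesting depth.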
