Heterogeneous adaptive systems

Most adaptive systems are homogeneous, i.e., they are built from processing elements of a single type. MLP neural networks and decision trees use nodes that partition the input space with hyperplanes; other types of neural networks use nodes that provide spherical or ellipsoidal decision borders. Such a fixed choice may not be the best inductive bias for a given dataset, frequently requiring a large number of processing elements even when simple solutions exist. Heterogeneous adaptive systems use different types of decision borders at each stage, enabling the discovery of the most appropriate bias for the data. Neural decision trees and similarity-based systems of this kind are described here. Results from a novel heterogeneous decision tree algorithm are presented as an example of this approach.
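The core idea can be sketched in a few lines: at each tree node, candidate splits of different geometric types (here, an axis-parallel hyperplane test and a spherical distance test) are evaluated, and the border type that yields the lower weighted impurity is kept. This is a minimal illustrative sketch, not the paper's actual algorithm; the function names, the use of Gini impurity, and the choice of class means as sphere centers are all assumptions made for the example.

```python
import numpy as np

def gini(y):
    # Gini impurity of a label vector.
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def split_quality(y, mask):
    # Weighted impurity of the two branches induced by a boolean mask.
    n = len(y)
    left, right = y[mask], y[~mask]
    if len(left) == 0 or len(right) == 0:
        return np.inf
    return (len(left) * gini(left) + len(right) * gini(right)) / n

def best_hyperplane_split(X, y):
    # Axis-parallel threshold test x[f] <= t, a special case of a hyperplane border.
    best = (np.inf, None)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f])[:-1]:
            q = split_quality(y, X[:, f] <= t)
            if q < best[0]:
                best = (q, ("hyperplane", f, t))
    return best

def best_spherical_split(X, y):
    # Distance test ||x - c|| <= r, with candidate centers at the class means.
    best = (np.inf, None)
    for cls in np.unique(y):
        c = X[y == cls].mean(axis=0)
        d = np.linalg.norm(X - c, axis=1)
        for r in np.unique(d)[:-1]:
            q = split_quality(y, d <= r)
            if q < best[0]:
                best = (q, ("sphere", c, r))
    return best

def heterogeneous_split(X, y):
    # Keep whichever border type gives the lower weighted impurity.
    return min(best_hyperplane_split(X, y),
               best_spherical_split(X, y),
               key=lambda b: b[0])
```

On data where one class surrounds the other, a single spherical test separates the classes exactly while no single axis-parallel threshold can, so the heterogeneous node picks the spherical border; on linearly separable data the hyperplane test wins instead. This is the sense in which the appropriate bias is discovered per node rather than fixed for the whole model.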
