Supervised neural networks for the classification of structures

Standard neural networks and statistical methods are usually believed to be inadequate for dealing with complex structures because of their feature-based approach. Indeed, feature-based approaches often fail to give satisfactory solutions because they are sensitive to the a priori selection of features and cannot represent specific information about the relationships among the components of a structure. We show, however, that neural networks can in fact represent and classify structured patterns. The key idea underpinning our approach is the so-called "generalized recursive neuron", which is essentially a generalization to structures of a recurrent neuron. Using generalized recursive neurons, all the supervised networks developed for the classification of sequences, such as backpropagation-through-time networks, real-time recurrent networks, simple recurrent networks, recurrent cascade-correlation networks, and neural trees, can be generalized to structures. We present the results obtained by some of these networks (with generalized recursive neurons) on the classification of logic terms.
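As a minimal illustration of the idea (not the paper's exact formulation), a generalized recursive neuron can be sketched as follows: the neuron's output at each node of a labeled tree is a squashed weighted sum of the node's label and of the outputs already computed for its children, evaluated bottom-up. The weight names, the tanh squashing function, and the zero-padding of children to a fixed maximum out-degree are all assumptions of this sketch.

```python
import math

def recursive_neuron(tree, w_label, w_children, bias, max_children):
    """Hedged sketch of a generalized recursive neuron.

    A node is a pair (label_vector, list_of_child_subtrees). The output
    for a node depends on its own label and, recursively, on the outputs
    computed for its children -- the structural analogue of a recurrent
    neuron unfolding over time.
    """
    label, children = tree
    # weighted contribution of the node's own label
    s = bias + sum(w * x for w, x in zip(w_label, label))
    # recurse into the subtrees, then pad to the maximum out-degree
    # so each child position has its own weight
    child_outputs = [recursive_neuron(c, w_label, w_children, bias, max_children)
                     for c in children]
    child_outputs += [0.0] * (max_children - len(child_outputs))
    s += sum(w * y for w, y in zip(w_children, child_outputs))
    return math.tanh(s)  # squashing nonlinearity

# tiny example: a logic term such as f(a, g(b)) encoded as a labeled
# tree with (illustrative) 2-dimensional numeric labels
tree = ([1.0, 0.0], [([0.0, 1.0], []),
                     ([1.0, 1.0], [([0.0, 0.5], [])])])
out = recursive_neuron(tree, w_label=[0.3, -0.2], w_children=[0.5, 0.4],
                       bias=0.1, max_children=2)
print(out)
```

In a trained network, the weights would be adjusted by a structural analogue of backpropagation through time; here they are fixed values chosen only to make the recursion concrete.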