A Comparison between Recursive Neural Networks and Graph Neural Networks

Recursive neural networks (RNNs) and graph neural networks (GNNs) are two connectionist models that can directly process graphs. RNNs and GNNs exploit a similar processing framework, but they apply to different input domains: RNNs require the input graphs to be directed and acyclic, whereas GNNs can process any kind of graph. The aim of this paper is to understand whether this difference affects the behaviour of the models in a real application. An experimental comparison on an image classification problem is presented, showing that GNNs outperform RNNs. Moreover, the main differences between the models are discussed with respect to their input domains, their approximation capabilities, and their learning algorithms.
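A minimal sketch (not the paper's implementation) may help contrast the two processing schemes described above: a recursive network computes node states once, in topological order over a DAG, while a Scarselli-style GNN iterates a shared state-transition function on an arbitrary, possibly cyclic graph until the states approach a fixed point. All names (f_w, g_w, the toy graphs, the random parameters) are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    STATE_DIM, LABEL_DIM = 4, 3
    W_f = rng.normal(scale=0.1, size=(STATE_DIM, LABEL_DIM + STATE_DIM))
    W_g = rng.normal(scale=0.1, size=(1, STATE_DIM))

    def f_w(label, neighbour_states):
        # Shared transition: combine a node label with the sum of neighbour states.
        agg = np.sum(neighbour_states, axis=0) if len(neighbour_states) else np.zeros(STATE_DIM)
        return np.tanh(W_f @ np.concatenate([label, agg]))

    def g_w(state):
        # Shared output function mapping a node state to a scalar.
        return float(W_g @ state)

    def recursive_net(dag_children, labels, root):
        # RNN-style processing: children are visited before parents, so the
        # graph must be directed and acyclic; each state is computed once.
        states = {}
        def visit(v):
            if v not in states:
                states[v] = f_w(labels[v], [visit(c) for c in dag_children[v]])
            return states[v]
        return g_w(visit(root))

    def graph_net(neighbours, labels, iters=50):
        # GNN-style processing: the same update is iterated on every node of an
        # arbitrary graph; a contraction assumption gives convergence to a fixed point.
        states = {v: np.zeros(STATE_DIM) for v in neighbours}
        for _ in range(iters):
            states = {v: f_w(labels[v], [states[u] for u in neighbours[v]])
                      for v in neighbours}
        return {v: g_w(s) for v, s in states.items()}

    labels = {v: rng.normal(size=LABEL_DIM) for v in "abc"}
    print(recursive_net({"a": ["b", "c"], "b": [], "c": []}, labels, root="a"))
    print(graph_net({"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}, labels))

The second call runs on a cyclic graph, which the recursive scheme cannot accept without first mapping the graph to a DAG; this is the domain difference the experimental comparison investigates.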
