On the Efficient Classification of Data Structures by Neural Networks

In recent years, recurrent neural networks have been shown to be capable of processing general data structures such as trees and graphs, opening the door to a number of interesting, previously unexplored applications. In this paper, we analyze the efficiency of learning the membership of DOAGs (Directed Ordered Acyclic Graphs) in terms of local minima of the error surface, relying on the principle that the absence of local minima is a guarantee of efficient learning. We give sufficient conditions under which the error surface is local-minima free. Specifically, we define a topological index associated with a collection of DOAGs that makes it possible to design the architecture so as to avoid local minima.
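To make the setting concrete, the following is a minimal sketch (not the paper's exact formulation) of how a recursive neural network can classify a DOAG: each node's state is computed from its label and the states of its ordered children, nodes are visited so that children precede parents, and the root's state feeds a classification output. All names and dimensions (Node, RecursiveNet, state_dim, max_outdegree) are illustrative assumptions.

import numpy as np

class Node:
    def __init__(self, label, children=()):
        self.label = np.asarray(label, dtype=float)  # node label vector
        self.children = list(children)               # ordered children (DOAG)

def topological_order(root):
    """Return nodes so that every child precedes its parents
    (DFS post-order, visiting shared nodes only once, as needed for a DAG)."""
    order, seen = [], set()
    def visit(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for c in n.children:
            visit(c)
        order.append(n)
    visit(root)
    return order

class RecursiveNet:
    def __init__(self, label_dim, state_dim, max_outdegree, rng=None):
        rng = rng or np.random.default_rng(0)
        # One weight block per child position: in a DOAG the order of
        # children is significant, so each position gets its own matrix.
        self.W_label = rng.normal(0.0, 0.1, (state_dim, label_dim))
        self.W_child = rng.normal(0.0, 0.1, (max_outdegree, state_dim, state_dim))
        self.w_out = rng.normal(0.0, 0.1, state_dim)

    def forward(self, root):
        states = {}
        for n in topological_order(root):
            s = self.W_label @ n.label
            for k, c in enumerate(n.children):
                s += self.W_child[k] @ states[id(c)]  # position-dependent weights
            states[id(n)] = np.tanh(s)
        # Sigmoid output: probability that the DOAG belongs to the class.
        return 1.0 / (1.0 + np.exp(-self.w_out @ states[id(root)]))

# Usage: classify a small DOAG in which one node is shared by two parents.
shared = Node([1.0, 0.0])
root = Node([0.0, 1.0], children=[shared, Node([0.5, 0.5], children=[shared])])
net = RecursiveNet(label_dim=2, state_dim=4, max_outdegree=2)
print(net.forward(root))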
