FONN: Combining First Order Logic with Connectionist Learning

This paper presents a neural network architecture that can manage structured data and refine knowledge bases expressed in a first order logic language. The presented framework is well suited to classification problems in which concept descriptions depend upon numerical features of the data. In fact, the main goal of the neural architecture is to refine the numerical part of the knowledge base, without changing its structure. In particular, we discuss a method to translate a set of classification rules into neural computation units. Here, we focus our attention on the translation method and on algorithms to refine network weights on structured data. The classification theory to be refined can be manually handcrafted or automatically acquired by a symbolic relational learning system able to deal with numerical features. As a matter of fact, the primary goal is to bring into a neural network architecture the capability of dealing with structured data of unrestricted size, by allowing the classification rules to be dynamically bound to different items occurring in the input data. An extensive experimentation on a challenging artificial case study shows that the network converges quite fast and generalizes much better than propositional learners on an equivalent task definition.
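As a rough illustration of the kind of translation the abstract describes, the sketch below softens a rule antecedent of the form "f(x) > θ" into a sigmoid unit whose threshold becomes a trainable weight, so that gradient descent can refine the numerical part of a rule while leaving its logical structure intact. All names, the steepness parameter, and the product-as-conjunction choice are our own assumptions for illustration, not the paper's actual encoding.

```python
import math

def soft_condition(value, theta, beta=4.0):
    """Differentiable truth value of the numerical condition value > theta.

    A hypothetical encoding: sigma(beta * (value - theta)), where theta is a
    trainable threshold and beta controls the steepness of the transition.
    """
    return 1.0 / (1.0 + math.exp(-beta * (value - theta)))

def rule_unit(features, thetas, beta=4.0):
    """Soft conjunction of the rule's numerical conditions.

    Here the AND of the antecedent's conditions is approximated by a
    product of the individual soft truth values (one possible choice).
    """
    out = 1.0
    for value, theta in zip(features, thetas):
        out *= soft_condition(value, theta, beta)
    return out

def refine_threshold(value, theta, label, lr=0.1, beta=4.0):
    """One gradient step on theta for a single labelled example.

    Uses a squared-error loss; only the numerical threshold moves, the
    rule's structure is untouched.
    """
    y = soft_condition(value, theta, beta)
    # d y / d theta = -beta * y * (1 - y)
    grad = 2.0 * (y - label) * (-beta) * y * (1.0 - y)
    return theta - lr * grad
```

For an example with feature value 1.0, threshold 0.5, and positive label, repeated calls to `refine_threshold` push the threshold downward, making the soft condition fire more strongly on that example.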