First-order logic learning in Artificial Neural Networks

Artificial Neural Networks have previously been applied in neuro-symbolic learning to learn ground logic program rules. However, few results exist on learning relations with neuro-symbolic systems. This paper presents PAN, a system that can learn relations. The inputs to PAN are one or more atoms, representing the conditions of a logic rule, and the output is the conclusion of the rule. The symbolic inputs may include functional terms of arbitrary depth and arity, and the output may include terms constructed from the input functors. Each symbolic input is encoded as an integer by an invertible encoding function, which is applied in reverse to extract the output terms. The main advance of the system is a convention for constructing Artificial Neural Networks able to learn rules with the same expressive power as first-order definite clauses. The system is tested on three examples and the results are discussed.
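
The abstract does not specify PAN's actual encoding function, so the following is only a minimal sketch of what an invertible term-to-integer encoding could look like, assuming a Cantor pairing function and a fixed functor table; the names encode_term, decode_term and the FUNCTORS table are illustrative, not taken from the paper.

    # Sketch: an invertible encoding of first-order terms as natural numbers.
    # Terms are nested tuples, e.g. ("f", ("g", ("a",)), ("b",)) for f(g(a), b).
    # The functor table and pairing scheme are assumptions, not PAN's own.

    FUNCTORS = ["a", "b", "f", "g"]   # hypothetical, fixed symbol table

    def cantor_pair(x, y):
        # Bijection from pairs of naturals to naturals.
        return (x + y) * (x + y + 1) // 2 + y

    def cantor_unpair(z):
        # Inverse of cantor_pair, with guards against floating-point error.
        w = (int((8 * z + 1) ** 0.5) - 1) // 2
        while (w + 1) * (w + 2) // 2 <= z:
            w += 1
        while w * (w + 1) // 2 > z:
            w -= 1
        y = z - w * (w + 1) // 2
        return w - y, y

    def encode_term(term):
        # term = (functor, arg1, ..., argN); arguments are themselves terms.
        functor, *args = term
        return cantor_pair(FUNCTORS.index(functor), _encode_args(args))

    def _encode_args(args):
        # Encode an argument list; 0 marks the empty list.
        if not args:
            return 0
        return cantor_pair(encode_term(args[0]), _encode_args(args[1:])) + 1

    def decode_term(code):
        # Run the encoding in reverse to rebuild the term.
        idx, args_code = cantor_unpair(code)
        return (FUNCTORS[idx], *_decode_args(args_code))

    def _decode_args(code):
        if code == 0:
            return []
        head, tail = cantor_unpair(code - 1)
        return [decode_term(head), *_decode_args(tail)]

    if __name__ == "__main__":
        t = ("f", ("g", ("a",)), ("b",))
        n = encode_term(t)
        assert decode_term(n) == t    # round trip recovers the original term
        print(t, "->", n)

Because such an encoding is injective, decode_term(encode_term(t)) recovers t exactly, which is the property the abstract relies on when the encoding function is used in reverse to extract the output terms.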
