Logical Boltzmann Machines

The idea of representing symbolic knowledge in connectionist systems has been a long-standing endeavour, and it has attracted much attention recently with the objective of combining machine learning with scalable, sound reasoning. Early work showed a correspondence between propositional logic and symmetrical neural networks, but those networks did not scale well with the number of variables and their training regime was inefficient. In this paper, we introduce Logical Boltzmann Machines (LBM), a neurosymbolic system that can represent any propositional logic formula in strict disjunctive normal form. We prove an equivalence between energy minimization in LBM and logical satisfiability, thus showing that LBM is capable of sound reasoning. We evaluate reasoning empirically to show that LBM can find all satisfying assignments of a class of logical formulae by searching fewer than 0.75% of the approximately one billion possible assignments. We compare learning in LBM with a symbolic inductive logic programming system, a state-of-the-art neurosymbolic system and a purely neural network-based system, achieving better learning performance on five out of seven data sets.
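The equivalence between energy minimization and satisfiability can be illustrated with a small sketch. The idea is to attach one hidden unit to each conjunct of a formula in strict disjunctive normal form (where at most one conjunct can be true at a time) and define an RBM-style energy so that satisfying assignments reach a strictly lower minimum energy than non-satisfying ones. The particular weights and biases below (literal weight 1, bias offset 0.5) are an illustrative choice for this demo, not the paper's exact parameterisation, and the formula is a made-up example; the sketch verifies the energy/satisfiability separation by brute force.

```python
from itertools import product

# Formula in strict DNF (the two conjuncts are mutually exclusive via x3):
#   F = (x1 AND x2 AND NOT x3) OR (NOT x1 AND x3)
# Each conjunct is (positive_vars, negative_vars), variables indexed 0..2.
conjuncts = [({0, 1}, {2}), ({2}, {0})]
n = 3

def energy(x, h):
    """RBM-style energy with one hidden unit per conjunct."""
    e = 0.0
    for j, (pos, neg) in enumerate(conjuncts):
        # Net input is +0.5 iff conjunct j is satisfied, and <= -0.5 otherwise.
        net = sum(x[i] for i in pos) - sum(x[i] for i in neg) - (len(pos) - 0.5)
        e -= h[j] * net
    return e

def min_energy(x):
    """Minimise the energy over all hidden states, given visible state x."""
    return min(energy(x, h) for h in product([0, 1], repeat=len(conjuncts)))

def satisfies(x):
    return any(all(x[i] for i in pos) and not any(x[i] for i in neg)
               for pos, neg in conjuncts)

# Satisfying assignments reach energy -0.5; all others stay at >= 0.
for x in product([0, 1], repeat=n):
    if satisfies(x):
        assert abs(min_energy(x) - (-0.5)) < 1e-9
    else:
        assert min_energy(x) >= 0.0
```

Strictness of the DNF matters here: because at most one conjunct can be satisfied, every satisfying assignment attains exactly the same minimum energy, which is what makes the energy threshold a sound test for satisfiability.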
