Neural Markov Logic Networks

We introduce Neural Markov Logic Networks (NMLNs), a statistical relational learning system that borrows ideas from Markov logic. Like Markov Logic Networks (MLNs), NMLNs are an exponential-family model over possible worlds, but unlike MLNs, they do not rely on explicitly specified first-order logic rules. Instead, NMLNs learn an implicit representation of such rules as a neural network that acts as a potential function on fragments of the relational structure. Interestingly, any MLN can be represented as an NMLN. Similarly to the recently proposed Neural Theorem Provers (NTPs) [Rocktäschel and Riedel, 2017], NMLNs can exploit embeddings of constants, but unlike NTPs, they also work well in their absence. This is particularly important for making predictions in settings other than the transductive one. We showcase the potential of NMLNs on knowledge-base completion tasks and on the generation of molecular (graph) data.
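To make the core idea concrete, the following is a minimal, illustrative sketch (not the authors' implementation) of a neural potential function over relational fragments. It assumes, purely for illustration, that each fragment is encoded as a fixed-length binary vector of ground-atom truth values; the names FragmentPotential, fragment_size, and all hyperparameters are hypothetical.

# Sketch of the NMLN idea: a neural network acts as a potential function on
# fixed-size fragments of a relational structure, and the unnormalized
# log-probability of a possible world is the sum of fragment potentials
# (exponential-family form). Fragment encoding and sizes are assumptions.

import torch
import torch.nn as nn


class FragmentPotential(nn.Module):
    """Neural potential phi(fragment) in place of hand-written first-order rules."""

    def __init__(self, fragment_size: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(fragment_size, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, fragments: torch.Tensor) -> torch.Tensor:
        # fragments: (num_fragments, fragment_size) binary tensor of ground-atom truth values
        return self.net(fragments).squeeze(-1)  # one scalar potential per fragment


def unnormalized_log_prob(potential: FragmentPotential,
                          fragments: torch.Tensor) -> torch.Tensor:
    # log p(world) is proportional to the sum of potentials over the world's fragments
    return potential(fragments).sum()


if __name__ == "__main__":
    # Toy example: a possible world decomposed into 10 fragments of 6 ground atoms each.
    torch.manual_seed(0)
    phi = FragmentPotential(fragment_size=6)
    world_fragments = torch.randint(0, 2, (10, 6)).float()
    print(unnormalized_log_prob(phi, world_fragments))

In an actual system, the potential's parameters would be trained so that observed worlds receive high unnormalized log-probability relative to perturbed ones (e.g., via contrastive or sampling-based objectives); the sketch only shows the scoring side.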

[1] Matthew Richardson et al. Markov Logic Networks. Machine Learning, 2006.

[2] Edward Grefenstette et al. Learning Reasoning Strategies in End-to-End Differentiable Proving. ICML, 2020.

[3] Michael I. Jordan et al. Graphical Models, Exponential Families, and Variational Inference. Foundations and Trends in Machine Learning, 2008.

[4] Jun Zhao et al. Knowledge Graph Embedding via Dynamic Mapping Matrix. ACL, 2015.

[5] Kristian Kersting et al. Gradient-based Boosting for Statistical Relational Learning: The Markov Logic Network and Missing Data Cases. Machine Learning, 2015.

[6] David Poole et al. Representing Aggregators in Relational Probabilistic Models. AAAI, 2015.

[7] Stephen P. Boyd et al. Convex Optimization. Algorithms and Theory of Computation Handbook, 2004.

[8] George Papadatos et al. The ChEMBL Database in 2017. Nucleic Acids Research, 2016.

[9] Zhendong Mao et al. Knowledge Graph Embedding: A Survey of Approaches and Applications. IEEE Transactions on Knowledge and Data Engineering, 2017.

[10] Luc De Raedt et al. DeepProbLog: Neural Probabilistic Logic Programming. BNAIC/BENELEARN, 2018.

[11] Steven Schockaert et al. Lifted Relational Neural Networks: Efficient Learning of Latent Relational Structures. Journal of Artificial Intelligence Research, 2018.

[12] Seyed Mehran Kazemi et al. RelNN: A Deep Neural Model for Relational Learning. AAAI, 2017.

[13] Pedro M. Domingos et al. Sound and Efficient Inference with Probabilistic and Deterministic Dependencies. AAAI, 2006.

[14] Marco Gori et al. Integrating Learning and Reasoning with Deep Logic Models. ECML/PKDD, 2019.

[15] Sebastijan Dumancic et al. From Statistical Relational to Neuro-Symbolic Artificial Intelligence. IJCAI, 2020.

[16] Christian P. Robert et al. Monte Carlo Statistical Methods. Springer Texts in Statistics, 2005.

[17] Manfred Jaeger et al. Inference, Learning, and Population Size: Projectivity for SRL Models. arXiv, 2018.

[18] Vibhav Gogate et al. On Lifting the Gibbs Sampling Algorithm. StarAI@UAI, 2012.

[19] Jure Leskovec et al. How Powerful are Graph Neural Networks? ICLR, 2018.

[20] Steven Schockaert et al. Relational Marginal Problems: Theory and Estimation. AAAI, 2017.

[21] Razvan Pascanu et al. Learning Deep Generative Models of Graphs. ICLR, 2018.

[22] Guillaume Bouchard et al. Knowledge Graph Completion via Complex Tensor Factorization. Journal of Machine Learning Research, 2017.

[23] Danqi Chen et al. Reasoning With Neural Tensor Networks for Knowledge Base Completion. NIPS, 2013.

[24] Xavier Bresson et al. Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering. NIPS, 2016.

[25] Marco Gori et al. LYRICS: A General Interface Layer to Integrate AI and Deep Learning. arXiv, 2019.

[26] Yuanzhuo Wang et al. Locally Adaptive Translation for Knowledge Graph Embedding. AAAI, 2015.

[27] Armando Solar-Lezama et al. Learning Libraries of Subroutines for Neurally-Guided Bayesian Program Induction. NeurIPS, 2018.

[28] Jason Weston et al. Joint Learning of Words and Meaning Representations for Open-Text Semantic Parsing. AISTATS, 2012.

[29] Jure Leskovec et al. GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models. ICML, 2018.

[30] Christopher M. Bishop. Current address: Microsoft Research, 2022.

[31] Jure Leskovec et al. GraphRNN: A Deep Generative Model for Graphs. ICML, 2018.

[32] Pedro M. Domingos et al. Statistical Predicate Invention. ICML, 2007.

[33] Ah Chung Tsoi et al. The Graph Neural Network Model. IEEE Transactions on Neural Networks, 2009.

[34] Jean Christoph Jung et al. Quantified Markov Logic Networks. KR, 2018.

[35] Uffe Kjærulff et al. Blocking Gibbs Sampling in Very Large Probabilistic Expert Systems. International Journal of Human-Computer Studies, 1995.

[36] Jason Weston et al. Learning Structured Embeddings of Knowledge Bases. AAAI, 2011.

[37] Geoffrey E. Hinton. Training Products of Experts by Minimizing Contrastive Divergence. Neural Computation, 2002.

[38] Marco Gori et al. Semantic-based Regularization for Learning and Inference. Artificial Intelligence, 2017.

[39] Praveen Paritosh et al. Freebase: A Collaboratively Created Graph Database for Structuring Human Knowledge. SIGMOD, 2008.

[40] Paolo Frasconi et al. Prediction of Protein Beta-Residue Contacts by Markov Logic Networks with Grounding-Specific Weights. Bioinformatics, 2009.

[41] Jeffrey Dean et al. Distributed Representations of Words and Phrases and their Compositionality. NIPS, 2013.

[42] Jason Weston et al. Translating Embeddings for Modeling Multi-relational Data. NIPS, 2013.

[43] Tim Rocktäschel et al. End-to-end Differentiable Proving. NIPS, 2017.

[44] Edward Grefenstette et al. Differentiable Reasoning on Large Knowledge Bases and Natural Language. Knowledge Graphs for eXplainable Artificial Intelligence, 2019.

[45] George A. Miller et al. WordNet: A Lexical Database for English. HLT, 1995.

[46] Artur S. d'Avila Garcez et al. Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge. NeSy@HLAI, 2016.

[47] Jesse Davis et al. Markov Logic Networks for Knowledge Base Completion: A Theoretical Analysis Under the MCAR Assumption. UAI, 2019.

[48] Koray Kavukcuoglu et al. Learning Word Embeddings Efficiently with Noise-Contrastive Estimation. NIPS, 2013.

[49] Zhao Zhang et al. Knowledge Graph Embedding with Hierarchical Relation Structure. EMNLP, 2018.