A Comparative Study of Distributional and Symbolic Paradigms for Relational Learning

Many real-world domains can be expressed as graphs and, more generally, as multi-relational knowledge graphs. Although reasoning and learning with knowledge graphs have traditionally been addressed by symbolic approaches, recent methods in (deep) representation learning have shown promising results on specialized tasks such as knowledge base completion. These approaches abandon the traditional symbolic paradigm by replacing symbols with vectors in Euclidean space. With few exceptions, symbolic and distributional approaches have been explored in different communities, and little is known about their respective strengths and weaknesses. In this work, we compare representation learning and relational learning on various relational classification and clustering tasks, and we analyse the complexity of the rules implicitly used by these approaches. Preliminary results reveal possible indicators that could help in choosing one approach over the other for a particular knowledge graph.
