Jointly Embedding Knowledge Graphs and Logical Rules

Embedding knowledge graphs into continuous vector spaces has recently attracted increasing interest. Most existing methods perform the embedding task using only fact triples. Logical rules, although they contain rich background information, have not been well studied in this task. This paper proposes a novel method for jointly embedding knowledge graphs and logical rules. The key idea is to represent and model triples and rules in a unified framework. Specifically, triples are represented as atomic formulae and modeled by the translation assumption, while rules are represented as complex formulae and modeled by t-norm fuzzy logics. Embedding then amounts to minimizing a global loss over both atomic and complex formulae. In this manner, we learn embeddings compatible not only with triples but also with rules, making them more predictive for knowledge acquisition and inference. We evaluate our method on link prediction and triple classification tasks. Experimental results show that joint embedding brings significant and consistent improvements over state-of-the-art methods. In particular, it enhances the prediction of new facts that cannot be directly inferred by pure logical inference, demonstrating the capability of our method to learn more predictive embeddings.
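The unified framework described above can be illustrated with a minimal sketch: a triple gets a soft truth value from the translation assumption (head + relation should be close to tail), a grounded rule gets a truth value by combining triple truths with a t-norm fuzzy-logic implication, and both kinds of formulae feed one global loss. All names, the squashing function, and the toy embeddings below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Toy entity and relation embeddings (hypothetical vocabulary).
entities = {e: rng.normal(size=dim) for e in ["alice", "bob"]}
relations = {r: rng.normal(size=dim) for r in ["parent_of", "ancestor_of"]}

def triple_truth(h, r, t):
    """Soft truth of an atomic formula (triple) under the translation
    assumption: h + r should lie near t. The distance is squashed into
    (0, 1]; the exact squashing here is an assumed choice."""
    dist = np.linalg.norm(entities[h] + relations[r] - entities[t])
    return 1.0 / (1.0 + dist)

def rule_truth(premise, conclusion):
    """Soft truth of a grounded implication p => q, using the product
    t-norm style composition I(p => q) = p*q - p + 1, so the rule is
    fully true whenever the premise is false or the conclusion is true."""
    p = triple_truth(*premise)
    q = triple_truth(*conclusion)
    return p * q - p + 1.0

# One atomic formula and one complex formula (a grounded rule).
t_score = triple_truth("alice", "parent_of", "bob")
r_score = rule_truth(("alice", "parent_of", "bob"),
                     ("alice", "ancestor_of", "bob"))

# Global loss over both kinds of formulae: push all truth values toward 1.
# A full method would minimize this over embeddings with negative sampling.
loss = (1.0 - t_score) + (1.0 - r_score)
print(f"triple={t_score:.3f} rule={r_score:.3f} loss={loss:.3f}")
```

Minimizing such a loss over both formula types is what makes the learned embeddings compatible with rules as well as observed triples.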
