Context-aware Entity Typing in Knowledge Graphs

Knowledge graph entity typing aims to infer missing entity types in knowledge graphs, an important but under-explored task. This paper proposes a novel method for this task that exploits entities' contextual information. Specifically, we design two inference mechanisms: i) N2T, which independently uses each neighbor of an entity to infer its type; and ii) Agg2T, which aggregates the neighbors of an entity to infer its type. These mechanisms produce multiple inference results, which are combined into a final prediction by an exponentially weighted pooling method. Furthermore, we propose a novel loss function that alleviates the false-negative problem during training. Experiments on two real-world knowledge graphs demonstrate the effectiveness of our method. The source code and data are available at https://github.com/CCIIPLab/CET.
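
To illustrate how several inference results for one entity might be merged, the sketch below shows a generic exponentially weighted (softmax-style) pooling over a stack of per-inference type score vectors. It is a minimal illustration only, assuming a PyTorch setting; the function name exp_weighted_pool and the tensor shapes are assumptions for exposition, not the released implementation.

    import torch

    def exp_weighted_pool(scores: torch.Tensor) -> torch.Tensor:
        """Exponentially weighted pooling over multiple inference results.

        scores: tensor of shape (num_inferences, num_types), e.g. one N2T
        score vector per neighbor plus one Agg2T score vector (assumed layout).
        Returns a (num_types,) tensor: for each type, a softmax-weighted
        average of its per-inference scores, so more confident inferences
        contribute more to the final prediction.
        """
        weights = torch.softmax(scores, dim=0)   # per-type weights across inferences
        return (weights * scores).sum(dim=0)     # weighted average over inferences

    # Example: pool 4 inference results over 6 candidate types
    pooled = exp_weighted_pool(torch.randn(4, 6))
    print(pooled.shape)  # torch.Size([6])

Under this assumed scheme, high-scoring inferences dominate the pooled score while low-scoring ones are downweighted rather than discarded.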
