LibKGE - A knowledge graph embedding library for reproducible research

LibKGE (https://github.com/uma-pi1/kge) is an open-source PyTorch-based library for training, hyperparameter optimization, and evaluation of knowledge graph embedding models for link prediction. The key goals of LibKGE are to enable reproducible research, to provide a framework for comprehensive experimental studies, and to facilitate analyzing the contributions of individual components of training methods, model architectures, and evaluation methods. LibKGE is highly configurable, and every experiment can be fully reproduced with a single configuration file. Individual components are decoupled to the extent possible so that they can be mixed and matched with each other. Implementations in LibKGE aim to be as efficient as possible without leaving the scope of Python/NumPy/PyTorch. A comprehensive logging mechanism and tooling facilitate in-depth analysis. LibKGE provides implementations of common knowledge graph embedding models and training methods, and new ones can be easily added. A comparative study (Ruffinelli et al., 2020) showed that LibKGE reaches performance competitive with the state of the art for many models with a modest amount of automatic hyperparameter tuning.
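
As an illustration of how a trained model can be reused outside the training pipeline, the sketch below loads a LibKGE checkpoint and scores link-prediction queries. It follows the usage pattern documented in the repository's README; the checkpoint path is hypothetical, and the API names (load_checkpoint, KgeModel.create_from, score_sp) are assumed from that documentation and may differ between versions.

```python
# Hedged sketch: querying a trained LibKGE model for link prediction.
# The checkpoint path is hypothetical; load_checkpoint, KgeModel.create_from,
# and score_sp follow the repository's documented usage and may vary by version.
import torch

from kge.model import KgeModel
from kge.util.io import load_checkpoint

# Load a model from a checkpoint produced by a LibKGE training job.
checkpoint = load_checkpoint("local/experiments/fb15k-237-complex/checkpoint_best.pt")
model = KgeModel.create_from(checkpoint)

# Score all candidate objects for two (subject, relation, ?) queries.
s = torch.tensor([0, 2], dtype=torch.long)   # subject entity indexes
p = torch.tensor([0, 1], dtype=torch.long)   # relation indexes
scores = model.score_sp(s, p)                # shape: (num_queries, num_entities)
predicted = torch.argmax(scores, dim=-1)     # highest-scoring object per query
print(predicted)
```

Training and hyperparameter-search jobs themselves are launched from a single YAML configuration file via the kge command-line tool, which is what makes each experiment reproducible from that one file.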

[1] Guillaume Bouchard et al. Complex Embeddings for Simple Link Prediction. ICML, 2016.

[2] Roy Schwartz et al. Knowledge Enhanced Contextual Word Representations. EMNLP/IJCNLP, 2019.

[3] Pasquale Minervini et al. Convolutional 2D Knowledge Graph Embeddings. AAAI, 2017.

[4] Zheng Zhang et al. DGL-KE: Training Knowledge Graph Embeddings at Scale. SIGIR, 2020.

[5] Zhendong Mao et al. Knowledge Graph Embedding: A Survey of Approaches and Applications. IEEE Transactions on Knowledge and Data Engineering, 2017.

[6] Jason Weston et al. Translating Embeddings for Modeling Multi-relational Data. NIPS, 2013.

[7] Zhiyuan Liu et al. OpenKE: An Open Toolkit for Knowledge Embedding. EMNLP, 2018.

[8] Jian-Yun Nie et al. RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space. ICLR, 2018.

[9] Rainer Gemulla et al. Can We Predict New Facts with Open Knowledge Graph Embeddings? A Benchmark for Open Link Prediction. ACL, 2020.

[10] Nicolas Usunier et al. Canonical Tensor Decomposition for Knowledge Base Completion. ICML, 2018.

[11] Yiming Yang et al. A Re-evaluation of Knowledge Graph Completion Methods. ACL, 2019.

[12] Jian Tang et al. GraphVite: A High-Performance CPU-GPU Hybrid System for Node Embedding. WWW, 2019.

[13] Sameer Singh et al. Embedding Multimodal Relational Data for Knowledge Base Completion. EMNLP, 2018.

[14] Volker Tresp et al. Bringing Light Into the Dark: A Large-Scale Evaluation of Knowledge Graph Embedding Models Under a Unified Framework. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2020.

[15] Volker Tresp et al. Improving Visual Relationship Detection Using Semantic Modeling of Scene Descriptions. SEMWEB, 2017.

[16] Alexander Peysakhovich et al. PyTorch-BigGraph: A Large-scale Graph Embedding System. SysML, 2019.

[17] Tianyu Gao et al. KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation. arXiv, 2019.

[18] Danqi Chen et al. Observed versus latent features for knowledge base and text inference. CVSC, 2015.

[19] Rainer Gemulla et al. You CAN Teach an Old Dog New Tricks! On Training Knowledge Graph Embeddings. ICLR, 2020.

[20] Christopher Ré et al. Low-Dimensional Hyperbolic Knowledge Graph Embeddings. ACL, 2020.

[21] Evgeniy Gabrilovich et al. A Review of Relational Machine Learning for Knowledge Graphs. Proceedings of the IEEE, 2015.

[22] Vít Nováček et al. Drug target discovery using knowledge graph embeddings. SAC, 2019.