Relation classification via knowledge graph enhanced transformer encoder

Abstract Relation classification is an important task in natural language processing. The goal is to predict predefined relations for marked nominal pairs in given sentences. State-of-the-art works usually use deep neural networks as classifiers to predict relations. The rich semantic information carried by the relationships in the triples of an existing knowledge graph (KG) can serve as additional supervision for relation classification. However, previous works simply used these relationships as labels specifying the class of a sentence, completely ignoring their semantic information. In this paper, we propose a novel approach to relation classification that jointly exploits information from textual sentences and knowledge graphs. To this end, we introduce a Transformer encoder that measures the semantic similarity between sentences and relation types. In addition, we connect the semantic information of the marked nominals in a sentence with that of the corresponding entities in the knowledge graph to generate semantic matching information between textual relations and KG relations; this matching information provides additional supervision for relation classification. Since words and entities interact with each other in our model, we propose an embedding translation strategy to bridge the semantic gap between word embeddings and entity embeddings. Experimental results on two widely used datasets, SemEval-2010 Task 8 and TACRED, show that our approach efficiently uses semantic information from the knowledge graph to enhance the performance of the Transformer encoder for relation classification.
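The embedding translation strategy mentioned above can be illustrated with a minimal sketch. The abstract does not specify the exact form of the translation, so the following assumes a simple learned linear projection that maps KG entity embeddings into the word-embedding space, after which a nominal's representation and its translated entity embedding become directly comparable (e.g., via cosine similarity) to produce a matching signal. All dimensions, names, and the projection form are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Assumed dimensions: word embeddings (300-d) and KG entity embeddings
# (100-d) live in different vector spaces, so a linear map W (trainable
# in practice, random here for illustration) translates entities into
# the word-embedding space.
rng = np.random.default_rng(0)
word_dim, entity_dim = 300, 100
W = rng.standard_normal((word_dim, entity_dim)) * 0.01  # translation matrix

def translate(entity_emb: np.ndarray) -> np.ndarray:
    """Map a KG entity embedding into the word-embedding space."""
    return W @ entity_emb

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors of equal dimension."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical example: compare a sentence-level nominal representation
# with the translated embedding of its corresponding KG entity; the
# similarity score could then serve as additional matching supervision.
nominal_vec = rng.standard_normal(word_dim)
entity_vec = rng.standard_normal(entity_dim)
sim = cosine(nominal_vec, translate(entity_vec))
```

In a real model, `W` would be learned jointly with the classifier so that translated entity embeddings align with the textual representations of their mentions.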