Incorporating Instance Correlations in Distantly Supervised Relation Extraction

Distantly supervised relation extraction has proven effective for finding relational facts in text. However, existing approaches treat the instances in the same bag independently and ignore their semantic structural information. In this paper, we propose a graph convolutional network (GCN) model with an attention mechanism to improve relation extraction. For each bag, the model first builds a graph from the dependency trees of the instances in that bag; in this way, correlations between instances are established through their shared words. The learned node (word) embeddings, which encode bag-level information, are then fed into a sentence encoder, i.e., a text CNN, to obtain better sentence representations. In addition, an instance-level attention mechanism is introduced to select valid instances and learn the textual relation embedding. Finally, the learned embedding is used to train our relation classifier. Experiments on two benchmark datasets demonstrate that our model significantly outperforms the compared baselines.
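The pipeline above can be sketched in a heavily simplified form. The NumPy snippet below is illustrative only: it approximates dependency arcs with adjacent-word edges (the paper uses actual dependency trees from a parser), replaces the text CNN with mean pooling over GCN word embeddings, and uses arbitrary embedding dimensions and a random relation query; all function names and values are hypothetical.

```python
import numpy as np

def build_bag_graph(bag):
    """Build one graph per bag: nodes are the bag's unique words, so the
    bag's instances are correlated through shared words. Edges here link
    adjacent words as a stand-in for real dependency arcs (assumption)."""
    vocab = sorted({w for sent in bag for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    n = len(vocab)
    A = np.eye(n)                               # self-loops
    for sent in bag:
        for a, b in zip(sent, sent[1:]):
            A[idx[a], idx[b]] = A[idx[b], idx[a]] = 1.0
    return vocab, idx, A

def gcn_layer(A, X, W):
    """One GCN layer: H = ReLU(D^{-1/2} A D^{-1/2} X W)."""
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A @ d_inv_sqrt @ X @ W, 0.0)

def attention_pool(S, q):
    """Instance-level attention: score each sentence vector in S (m x d)
    against a relation query q (d,), softmax, and return the bag vector."""
    scores = S @ q
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ S

rng = np.random.default_rng(0)
bag = [["obama", "was", "born", "in", "hawaii"],
       ["obama", "visited", "hawaii", "again"]]
vocab, idx, A = build_bag_graph(bag)
X = rng.normal(size=(len(vocab), 8))            # initial word embeddings (dim 8, arbitrary)
W = rng.normal(size=(8, 8))                     # GCN weight matrix
H = gcn_layer(A, X, W)                          # bag-aware word embeddings
# Crude sentence encoder: mean of a sentence's word embeddings (the paper uses a text CNN).
S = np.stack([H[[idx[w] for w in sent]].mean(axis=0) for sent in bag])
bag_vec = attention_pool(S, rng.normal(size=8)) # textual relation embedding, fed to a classifier
```

In the full model, `bag_vec` would be passed to a softmax classifier over relation labels, and the query vector would be a learned relation embedding rather than random.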
