RDSGAN: Rank-based Distant Supervision Relation Extraction with Generative Adversarial Framework

Distant supervision has been widely used for relation extraction, but it suffers from the noisy labeling problem. Neural network models with attention mechanisms have been proposed to denoise the training data, but they cannot fully eliminate noisy instances because every instance receives a non-zero attention weight. Hard-decision methods instead remove wrongly-labeled instances from the positive set, but discarding them also loses the useful information those instances contain. In this paper, we propose a novel generative neural framework named RDSGAN (Rank-based Distant Supervision GAN), which automatically generates valid instances for distant supervision relation extraction. Our framework combines soft attention and hard decision: it learns the distribution of true positive instances via adversarial training and selects valid instances conforming to that distribution via rank-based distant supervision, thereby addressing the false positive problem. Experimental results show the superiority of our framework over strong baselines.
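To make the contrast between soft attention and hard decision concrete, the following is a minimal, purely illustrative sketch (not the authors' actual RDSGAN architecture, which uses adversarial training): soft attention assigns every instance in a bag a non-zero weight, while a rank-based hard decision keeps only the top-ranked fraction as valid instances. All function names and the `keep_ratio` parameter are hypothetical.

```python
import numpy as np

def soft_attention_scores(instance_embs, relation_query):
    # Soft attention: weight every instance by its similarity to a
    # relation query vector; softmax guarantees non-zero weights.
    logits = instance_embs @ relation_query
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def rank_based_selection(instance_embs, relation_query, keep_ratio=0.5):
    # Hard decision on top of soft attention: rank instances by their
    # attention weight and keep only the top-ranked fraction as "valid"
    # instances, discarding likely false positives.
    weights = soft_attention_scores(instance_embs, relation_query)
    k = max(1, int(np.ceil(keep_ratio * len(weights))))
    keep = np.argsort(weights)[::-1][:k]
    return np.sort(keep), weights
```

For example, a bag of three sentence embeddings where two align with the relation query and one does not would keep the two aligned instances and drop the noisy one, even though the noisy instance still carried a non-zero soft weight.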
