Integrating Relation Constraints with Neural Relation Extractors

Recent years have seen rapid progress in identifying predefined relationships between entity pairs using neural networks (NNs). However, such models typically make predictions for each entity pair individually and thus often fail to resolve inconsistencies among different predictions, which can be characterized by discrete relation constraints. These constraints are often defined over combinations of entity-relation-entity triples, since explicitly well-defined type and cardinality requirements for the relations are frequently unavailable. In this paper, we propose a unified framework that integrates relation constraints with NNs by introducing a new loss term, ConstraintLoss. In particular, we develop two efficient methods to capture how well the local predictions from multiple instance pairs satisfy the relation constraints. Experiments on both English and Chinese datasets show that our approach helps NNs learn from discrete relation constraints to reduce inconsistency among local predictions, and that it outperforms popular neural relation extraction (NRE) models even when they are enhanced with extra post-processing. Our source code and datasets will be released at this https URL.
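To make the idea concrete, the following minimal PyTorch sketch shows one plausible way to turn discrete relation constraints into a differentiable penalty that is added to a standard classification loss. Only the name ConstraintLoss comes from the abstract; the specific formulation (an expected count of violated pairwise incompatibility constraints), helper names such as incompatible_pairs, and the 0.1 weight are illustrative assumptions, not the paper's actual methods.

```python
# Minimal sketch (not the paper's exact formulation): a soft constraint penalty
# combined with the usual relation-classification loss. We assume constraints
# are given as pairs of relations that may not both hold for two entity pairs
# sharing an argument; all names below are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConstraintLoss(nn.Module):
    """Penalize local predictions that jointly violate discrete relation constraints."""

    def __init__(self, incompatible_pairs, num_relations):
        super().__init__()
        # Binary mask M[r1, r2] = 1 if relations r1 and r2 may not co-occur
        # on entity pairs that share an argument.
        mask = torch.zeros(num_relations, num_relations)
        for r1, r2 in incompatible_pairs:
            mask[r1, r2] = 1.0
            mask[r2, r1] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, probs_a, probs_b):
        # probs_a, probs_b: (batch, num_relations) softmax outputs for two
        # entity pairs that share an argument. The expected number of violated
        # constraints is sum over (r1, r2) of M[r1, r2] * p_a(r1) * p_b(r2).
        violation = torch.einsum("bi,ij,bj->b", probs_a, self.mask, probs_b)
        return violation.mean()


# Usage: add the constraint penalty to the ordinary supervised loss.
num_relations = 5
constraint_loss = ConstraintLoss(incompatible_pairs=[(1, 2), (3, 4)],
                                 num_relations=num_relations)

logits_a = torch.randn(8, num_relations, requires_grad=True)
logits_b = torch.randn(8, num_relations, requires_grad=True)
labels_a = torch.randint(0, num_relations, (8,))

ce = F.cross_entropy(logits_a, labels_a)
penalty = constraint_loss(F.softmax(logits_a, dim=-1),
                          F.softmax(logits_b, dim=-1))
loss = ce + 0.1 * penalty   # 0.1 is an illustrative weight, not from the paper
loss.backward()
```

Because the penalty is computed from softmax probabilities rather than hard decisions, it remains differentiable and lets the constraint signal flow back into the extractor during training, which matches the abstract's goal of having NNs learn from discrete constraints instead of applying them only as post-processing.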
