A Simple but Effective Bidirectional Framework for Relational Triple Extraction

Tagging-based relational triple extraction methods have attracted growing research attention in recent years. However, most of these methods adopt a unidirectional extraction framework that first extracts all subjects and then, conditioned on those extracted subjects, extracts objects and relations simultaneously. This framework has an obvious deficiency: it is overly sensitive to the results of subject extraction. To overcome this deficiency, we propose a bidirectional extraction framework that extracts triples based on entity pairs obtained from two complementary directions. Concretely, we first extract all possible subject-object pairs from two parallel directions. These two extraction directions are connected by a shared encoder component, so extraction features can flow from one direction to the other and vice versa. In this way, the extractions in the two directions boost and complement each other. Next, we assign all possible relations to each entity pair with a biaffine model. During training, we observe that the shared structure leads to an inconsistency in convergence rates across components, which is harmful to performance, so we propose a share-aware learning mechanism to address it. We evaluate the proposed model on multiple benchmark datasets. Extensive experimental results show that the model is very effective and achieves state-of-the-art results on all of these datasets. Moreover, experiments show that both the proposed bidirectional extraction framework and the share-aware learning mechanism have good adaptability and can be used to improve the performance of other tagging-based methods. The source code of our work is available at: https://github.com/neukg/BiRTE.
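To make the relation-assignment step concrete, the following is a minimal sketch of a biaffine scorer in the standard Dozat-and-Manning formulation: for each candidate (subject, object) pair, a per-relation bilinear term plus a linear term over the concatenated pair produces one score per relation. All tensor shapes and variable names here are illustrative assumptions, not the authors' actual implementation (which operates over contextualized encoder representations).

```python
import numpy as np

def biaffine_scores(s, o, U, W, b):
    """Score every relation for one (subject, object) vector pair.

    s, o : (d,) subject / object representations (illustrative stand-ins
           for encoder outputs)
    U    : (r, d, d) per-relation bilinear weights
    W    : (r, 2*d) per-relation linear weights over the concatenated pair
    b    : (r,) per-relation bias
    Returns an (r,) vector of unnormalized relation scores.
    """
    bilinear = np.einsum("rij,i,j->r", U, s, o)   # s^T U_r o for each relation r
    linear = W @ np.concatenate([s, o])           # W_r [s; o]
    return bilinear + linear + b

# Tiny demo: 2 relation types, 3-dimensional representations.
rng = np.random.default_rng(0)
d, r = 3, 2
s, o = rng.normal(size=d), rng.normal(size=d)
U = rng.normal(size=(r, d, d))
W = rng.normal(size=(r, 2 * d))
b = np.zeros(r)

scores = biaffine_scores(s, o, U, W, b)
# A sigmoid over each score yields an independent probability that the
# pair holds that relation, which supports multi-label assignment
# (one entity pair can carry several relations).
probs = 1 / (1 + np.exp(-scores))
print(probs.shape)  # (2,)
```

Scoring relations independently per pair (sigmoid rather than softmax) is what lets a single subject-object pair receive zero, one, or several relation labels.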
