Extracting Multiple-Relations in One-Pass with Pre-Trained Transformers

Most approaches to extracting multiple relations from a paragraph require multiple passes over the paragraph. In practice, multiple passes are computationally expensive, which makes it difficult to scale to longer paragraphs and larger text corpora. In this work, we focus on the task of multiple relation extraction by encoding the paragraph only once (one-pass). We build our solution on pre-trained self-attentive (Transformer) models: we first add a structured prediction layer to handle extraction between multiple entity pairs, then enhance the paragraph embeddings with an entity-aware attention technique so that they capture the relational information associated with each entity. We show that our approach is not only scalable but also achieves state-of-the-art performance on the standard ACE 2005 benchmark.
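
To make the one-pass idea concrete, here is a minimal PyTorch sketch of the core step: encode the paragraph once, pool an embedding for each entity span, and classify every entity pair in a single batched operation. The class name `OnePassRelationScorer`, the mean-pooling of spans, and the bilinear classifier are illustrative assumptions, not the paper's exact structured prediction layer or its entity-aware attention.

```python
import torch
import torch.nn as nn


class OnePassRelationScorer(nn.Module):
    """Scores all entity pairs from a single encoding of the paragraph.

    A sketch under assumptions: span mean-pooling and a bilinear
    classifier stand in for the paper's structured prediction layer.
    """

    def __init__(self, hidden_size: int, num_relations: int):
        super().__init__()
        # One score per (head entity, tail entity, relation label) triple.
        self.bilinear = nn.Bilinear(hidden_size, hidden_size, num_relations)

    def forward(self, token_states: torch.Tensor, entity_spans: list):
        # token_states: (seq_len, hidden) -- output of ONE encoder pass.
        # entity_spans: [(start, end), ...] token offsets of each entity.
        entities = torch.stack(
            [token_states[s:e].mean(dim=0) for s, e in entity_spans]
        )  # (num_entities, hidden): mean-pooled span embeddings
        n = entities.size(0)
        # Broadcast to all ordered (head, tail) pairs and classify them
        # together, so no re-encoding is needed per entity pair.
        heads = entities.unsqueeze(1).expand(n, n, -1).reshape(n * n, -1)
        tails = entities.unsqueeze(0).expand(n, n, -1).reshape(n * n, -1)
        return self.bilinear(heads, tails).view(n, n, -1)  # (n, n, labels)


# Usage with random states standing in for real Transformer output:
scorer = OnePassRelationScorer(hidden_size=768, num_relations=7)
states = torch.randn(64, 768)                    # one pass over 64 tokens
logits = scorer(states, [(3, 5), (10, 12), (20, 23)])
print(logits.shape)                              # torch.Size([3, 3, 7])
```

The batched pair scoring is what underlies the scalability claim above: the encoder runs once per paragraph regardless of how many entity pairs the paragraph contains, so cost grows with paragraph length rather than with the number of candidate relations.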
