An Embarrassingly Simple Model for Dialogue Relation Extraction

Dialogue relation extraction (RE) aims to predict the relation type between two entities mentioned in a dialogue. In this paper, we model dialogue RE as a multi-label classification task and propose a simple yet effective model named SimpleRE. SimpleRE captures the interrelations among the multiple relations in a dialogue through a novel input format, the BERT Relation Token Sequence (BRS), in which multiple [CLS] tokens are used to capture the relations between different pairs of entities. A Relation Refinement Gate (RRG) is then designed to adaptively extract relation-specific semantic representations. Experiments on DialogRE show that SimpleRE achieves the best performance with a much shorter training time, and that it outperforms all direct baselines on sentence-level RE without using external resources.
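To make the BRS idea concrete, the following is a minimal sketch of how such an input sequence might be assembled, assuming the HuggingFace transformers tokenizer API. The function name build_brs and the exact token layout are illustrative assumptions for exposition, not the authors' released implementation.

    # Minimal BRS construction sketch (assumptions: HuggingFace transformers,
    # one extra [CLS] per entity pair; layout is illustrative only).
    from transformers import BertTokenizer

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

    def build_brs(dialogue: str, entity_pairs):
        """Concatenate the dialogue with one [CLS] ... [SEP] segment per
        entity pair, so each [CLS] position can later be read out as the
        representation of one candidate relation."""
        pieces = ["[CLS]", dialogue, "[SEP]"]
        for head, tail in entity_pairs:
            # One [CLS] per pair; its final hidden state encodes the
            # relation between (head, tail).
            pieces += ["[CLS]", head, "[SEP]", tail, "[SEP]"]
        text = " ".join(pieces)
        return tokenizer(text, add_special_tokens=False, return_tensors="pt")

    encoding = build_brs(
        "Speaker 1: Hi! Speaker 2: I'm her son.",
        [("Speaker 2", "Speaker 1")],
    )

A gate in the spirit of the RRG could be sketched as below; fusing each relation-specific [CLS] state with a sequence-level representation through a learned sigmoid gate is our assumption about the general mechanism, and the dimensions and fusion details do not reproduce the paper's exact formulation.

    # Minimal gating sketch (assumption: sigmoid-gated mix of each
    # relation-specific [CLS] state with a sequence-level context vector).
    import torch
    import torch.nn as nn

    class RelationRefinementGate(nn.Module):
        def __init__(self, hidden: int):
            super().__init__()
            self.gate = nn.Linear(2 * hidden, hidden)

        def forward(self, rel_cls: torch.Tensor, seq_repr: torch.Tensor):
            # rel_cls: (num_pairs, hidden) relation-specific [CLS] states
            # seq_repr: (hidden,) sequence-level context representation
            ctx = seq_repr.expand_as(rel_cls)
            g = torch.sigmoid(self.gate(torch.cat([rel_cls, ctx], dim=-1)))
            # Gated mix yields refined, relation-specific representations.
            return g * rel_cls + (1 - g) * ctx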
