Relation Extraction via Attention-Based CNNs using Token-Level Representations

Relation extraction is an important task in natural language processing (NLP). Most existing methods rely on word-level representations and therefore discard a great deal of information in the text. To address this issue, we use BERT to obtain pretrained token-level bidirectional contextual representations from raw sentences, and then feed these representations through an additional Transformer encoder. To capture the most important information in each sentence, we apply a convolutional neural network (CNN) with entity-aware attention to extract high-level features from the token-level representations. On the SemEval-2010 Task 8 dataset, our model achieves an F1-score of 87.8%, outperforming several previous models without relying on rich prior knowledge.
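
As a rough illustration of the pipeline the abstract describes, the sketch below wires BERT token representations into an extra Transformer encoder, an entity-aware attention layer, and a CNN with a linear classifier. It assumes PyTorch and the Hugging Face transformers library; the AttnCNNRelationClassifier name, the additive attention formulation, and all layer sizes are illustrative choices, not details taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class AttnCNNRelationClassifier(nn.Module):
    """Sketch: BERT -> extra Transformer encoder -> entity-aware attention -> CNN."""

    def __init__(self, num_relations=19, hidden=768, filters=256, kernel=3):
        super().__init__()
        # Pretrained token-level bidirectional contextual representations.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # One additional Transformer encoder layer over BERT's outputs.
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=8,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=1)
        # Additive entity-aware attention (illustrative formulation):
        # each token is scored against the concatenated entity vectors.
        self.w_h = nn.Linear(hidden, hidden)
        self.w_e = nn.Linear(hidden * 2, hidden)
        self.v = nn.Linear(hidden, 1)
        # CNN over the attended token sequence, then a linear classifier.
        self.conv = nn.Conv1d(hidden, filters, kernel_size=kernel, padding=1)
        self.classifier = nn.Linear(filters, num_relations)

    def forward(self, input_ids, attention_mask, e1_pos, e2_pos):
        # (B, T, H) token-level representations from BERT.
        h = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h = self.encoder(h, src_key_padding_mask=~attention_mask.bool())
        # Representations of the two marked entities (one position each).
        idx = torch.arange(h.size(0), device=h.device)
        e = torch.cat([h[idx, e1_pos], h[idx, e2_pos]], dim=-1)          # (B, 2H)
        # Entity-aware attention weights over all tokens; padding masked out.
        s = self.v(torch.tanh(self.w_h(h) + self.w_e(e).unsqueeze(1)))   # (B, T, 1)
        s = s.masked_fill(~attention_mask.bool().unsqueeze(-1), float("-inf"))
        h = h * torch.softmax(s, dim=1)
        # Convolve over time and max-pool to a single high-level feature vector.
        z = torch.relu(self.conv(h.transpose(1, 2))).max(dim=-1).values  # (B, F)
        return self.classifier(z)
```

Here num_relations=19 matches SemEval-2010 Task 8 (nine directed relation types plus Other), and e1_pos and e2_pos would index the first token of each marked entity; the exact attention scoring function and hyperparameters stand in for the paper's, which the abstract does not specify.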
