MRE: A Military Relation Extraction Model Based on BiGRU and Multi-Head Attention

A great deal of operational information exists in the form of text, so extracting it from unstructured military documents is of great significance for assisting command decision-making and operations. Military relation extraction, one of the main tasks of military information extraction, aims to identify the relation between two named entities in unstructured military text. Traditional extraction methods, however, struggle with inadequate hand-crafted features and inaccurate Chinese word segmentation in the military domain, and fail to make full use of the symmetrical entity relations present in military texts. We present a Chinese military relation extraction method that builds on a pre-trained language model and combines a bi-directional gated recurrent unit (BiGRU) network with a multi-head attention mechanism (MHATT). Specifically, the embedding layer combines word embeddings from the pre-trained language model with position embeddings; the forward and backward output vectors of the BiGRU are symmetrically spliced to capture contextual semantic features; and a multi-head attention mechanism is fused on top to strengthen the expression of semantic information. Extensive experiments on a military text corpus that we built show that our method outperforms traditional non-attention, attention, and improved-attention models, improving the comprehensive evaluation metric F1-score by about 4%.
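The pipeline described above (concatenating word and position embeddings, then applying multi-head self-attention over the sequence representation) can be sketched in NumPy. This is a minimal illustration under assumed toy dimensions, not the paper's implementation: the embedding values are random stand-ins for pre-trained-model outputs, the BiGRU stage is elided for brevity, and the position embeddings are modeled as two per-token vectors (one per candidate entity), a common convention in relation extraction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions, chosen only for illustration.
seq_len, d_word, d_pos = 6, 8, 4
d_model = d_word + 2 * d_pos   # word embedding + two position embeddings
n_heads = 4                    # d_model must be divisible by n_heads

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Embedding layer: concatenate each token's word embedding with its two
# position embeddings (relative distance to each candidate entity).
word_emb = rng.normal(size=(seq_len, d_word))
pos_emb_e1 = rng.normal(size=(seq_len, d_pos))
pos_emb_e2 = rng.normal(size=(seq_len, d_pos))
x = np.concatenate([word_emb, pos_emb_e1, pos_emb_e2], axis=-1)  # (seq_len, d_model)

# (In the full model a BiGRU would transform x here; we feed x straight
#  to attention to keep the sketch short.)

def multi_head_attention(x, n_heads):
    """Scaled dot-product multi-head self-attention over one sequence."""
    d_model = x.shape[-1]
    d_head = d_model // n_heads
    Wq, Wk, Wv, Wo = (rng.normal(size=(d_model, d_model)) for _ in range(4))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Split the projections into heads: (n_heads, seq_len, d_head).
    split = lambda t: t.reshape(t.shape[0], n_heads, d_head).transpose(1, 0, 2)
    q, k, v = map(split, (q, k, v))
    weights = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_head))  # attention per head
    out = weights @ v                                              # (n_heads, seq_len, d_head)
    out = out.transpose(1, 0, 2).reshape(x.shape[0], d_model)      # merge heads
    return out @ Wo

y = multi_head_attention(x, n_heads)
print(y.shape)  # -> (6, 16): one enriched d_model vector per token
```

A sentence-level representation for relation classification would then typically be pooled from `y` (e.g. max- or attention-pooling) before a softmax classifier over relation labels.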
