Semi-supervised Auto-encoder Based Event Detection in Constructing Knowledge Graph for Social Good

Knowledge graphs have recently been applied extensively in many different areas (e.g., disaster management and relief, disease diagnosis). For example, event-centric knowledge graphs have been developed to improve decision making in disaster management and relief. This paper focuses on the task of event detection, which is a precondition of event extraction for constructing event-centric knowledge graphs. Event detection identifies the trigger words of events in the sentences of a document and further classifies the event types. Context information is clearly useful for event detection, and feature-based methods therefore adopt cross-sentence information; however, they suffer from the complexity of human-designed features. Representation-based methods, on the other hand, learn document-level embeddings, which contain considerable noise because they are learned in an unsupervised manner. To overcome these problems, in this paper we propose a new model based on a Semi-supervised Auto-Encoder that learns Context information to Enhance Event Detection, hence called SAE-CEED. The model first pre-trains an auto-encoder on large-scale unlabeled text, so that the segment embeddings learned by the encoder capture the semantics and word order of the original text. It then uses the decoder to extract context embeddings and fine-tunes them to enhance a bidirectional neural network that identifies event triggers and their types in sentences. Through experiments on the benchmark ACE-2005 dataset, we demonstrate the effectiveness of the proposed SAE-CEED model. In addition, we systematically conduct a series of experiments to examine how the length of the text segments used to pre-train the auto-encoder affects event detection.
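The abstract describes a two-stage pipeline: pre-train a sequence auto-encoder on unlabeled text segments, then inject the resulting context embedding into a bidirectional network that labels triggers token by token. The sketch below is a minimal illustration of that idea, assuming a GRU-based seq2seq auto-encoder and a BiLSTM detector; all class names, dimensions, and the use of the encoder's final hidden state as the context vector are assumptions for illustration (the paper derives the context embedding via the decoder), not the authors' exact SAE-CEED architecture.

```python
# Minimal two-stage sketch in PyTorch. Module names, sizes, and wiring are
# illustrative assumptions, not the paper's exact SAE-CEED implementation.
import torch
import torch.nn as nn


class SegmentAutoEncoder(nn.Module):
    """Seq2seq auto-encoder pre-trained to reconstruct unlabeled text segments."""

    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, segment_ids):
        emb = self.embed(segment_ids)           # (B, T, E)
        _, h = self.encoder(emb)                # h: (1, B, H), segment summary
        dec_out, _ = self.decoder(emb, h)       # reconstruct with teacher forcing
        return self.out(dec_out), h.squeeze(0)  # reconstruction logits, context vector


class EventDetector(nn.Module):
    """BiLSTM trigger classifier enhanced with the pre-trained context vector."""

    def __init__(self, vocab_size, num_event_types, emb_dim=128,
                 ctx_dim=256, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim + ctx_dim, hid_dim,
                              batch_first=True, bidirectional=True)
        # One label per token: an event type or "not a trigger".
        self.classifier = nn.Linear(2 * hid_dim, num_event_types + 1)

    def forward(self, token_ids, context_vec):
        emb = self.embed(token_ids)                               # (B, T, E)
        # Broadcast the fine-tuned context embedding to every token position.
        ctx = context_vec.unsqueeze(1).expand(-1, emb.size(1), -1)
        h, _ = self.bilstm(torch.cat([emb, ctx], dim=-1))
        return self.classifier(h)                                 # (B, T, types+1)
```

In this reading, the auto-encoder would first be trained with a token-level reconstruction loss on large unlabeled corpora; its context vector is then passed to the detector and fine-tuned jointly with the BiLSTM on the labeled ACE-2005 sentences.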
