Concept Identification with Sequence-to-Sequence Models in Abstract Meaning Representation Parsing

Many NLP tasks require semantic information about their sentence input. Capturing sentence meaning through Abstract Meaning Representation (AMR) parsing requires, as a first step, concept identification. This work approaches concept identification with sequence-to-sequence models: building on an encoder-decoder architecture with an attention mechanism, concepts are split into verbs and non-verbs and predicted separately. Several alignment strategies are combined to enlarge the training and test data available to the model, yielding a merged dataset with twice as many instances. This increase in training data in turn produces a 10% improvement for the sequence-to-sequence model.
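The core of the encoder-decoder approach described above is the attention step: at each decoding position, the decoder scores every encoder state, normalizes the scores, and forms a weighted context vector. The sketch below is a minimal, dependency-free illustration of dot-product attention; the actual model uses learned alignment parameters (as in Bahdanau et al.), and the function and variable names here are hypothetical.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attend(decoder_state, encoder_states):
    # dot-product score between the decoder state and each encoder state
    scores = [sum(d * e for d, e in zip(decoder_state, h)) for h in encoder_states]
    weights = softmax(scores)
    # context vector: attention-weighted sum of the encoder states
    context = [sum(w * h[i] for w, h in zip(weights, encoder_states))
               for i in range(len(decoder_state))]
    return weights, context

# Toy usage: two orthogonal encoder states; the decoder state matches the first,
# so most of the attention mass falls on it.
weights, context = attend([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
```

In the full model this context vector is concatenated with the decoder state before predicting the next concept token; the verb/non-verb split simply trains this machinery on two separate target vocabularies.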
