Rewarding Smatch: Transition-Based AMR Parsing with Reinforcement Learning

We enrich the Stack-LSTM transition-based AMR parser (Ballesteros and Al-Onaizan, 2017) by augmenting training with policy learning, rewarding the Smatch score of sampled graphs. We also combine several AMR-to-text alignments with an attention mechanism, and we supplement the parser with pre-processed concept identification, named entities, and contextualized embeddings. We achieve highly competitive performance, comparable to the best published results, and present an in-depth study ablating each new component of the parser.
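The core training idea sketched above is a REINFORCE-style policy gradient (Williams [10]): sample a transition sequence from the parser's policy, score the resulting graph with Smatch, and scale the sequence log-likelihood by the reward minus a baseline. The sketch below is illustrative only, with hypothetical names (`reinforce_loss`, `smatch_reward`, `baseline`); it is not the paper's implementation.

```python
import math
import random

def softmax(logits):
    """Numerically stable softmax over a list of scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    return [e / z for e in exps]

def sample_action(probs, rng):
    """Draw one action index from a categorical distribution."""
    r, cum = rng.random(), 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

def reinforce_loss(step_logits, actions, smatch_reward, baseline):
    """Policy-gradient loss: -(R - b) * sum_t log p(a_t | state_t).

    Minimizing this loss raises the likelihood of transition sequences
    whose final graph scores above the baseline on Smatch, and lowers
    the likelihood of sequences that score below it.
    """
    logp = 0.0
    for logits, a in zip(step_logits, actions):
        logp += math.log(softmax(logits)[a])
    return -(smatch_reward - baseline) * logp
```

In actual training, `step_logits` would come from the Stack-LSTM at each transition step and `smatch_reward` from scoring the sampled graph against the gold AMR; the baseline reduces gradient variance, for instance using the greedy decode's score as in self-critical sequence training [7].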

[1] David Vilares et al. A Transition-Based Algorithm for Unrestricted AMR Parsing, 2018, NAACL.

[2] Philipp Koehn et al. Abstract Meaning Representation for Sembanking, 2013, LAW@ACL.

[3] Noah A. Smith et al. Training with Exploration Improves a Greedy Stack LSTM Parser, 2016, EMNLP.

[4] Kevin Duh et al. AMR Parsing as Sequence-to-Graph Transduction, 2019, ACL.

[5] Yaser Al-Onaizan et al. AMR Parsing using Stack-LSTMs, 2017, EMNLP.

[6] Johan Bos et al. Neural Semantic Parsing by Character-based Translation: Experiments with Abstract Meaning Representations, 2017, ArXiv.

[7] Vaibhava Goel et al. Self-Critical Sequence Training for Image Captioning, 2017, CVPR.

[8] Chuan Wang et al. Boosting Transition-based AMR Parsing with Refined Actions and Auxiliary Analyzers, 2015, ACL.

[9] Martha Palmer et al. Unsupervised AMR-Dependency Parse Alignment, 2017, EACL.

[10] Ronald J. Williams. Simple Statistical Gradient-Following Algorithms for Connectionist Reinforcement Learning, 1992, Machine Learning.

[11] Kevin Duh et al. DyNet: The Dynamic Neural Network Toolkit, 2017, ArXiv.

[12] Kevin Knight et al. Smatch: an Evaluation Metric for Semantic Feature Structures, 2013, ACL.

[13] Yuji Matsumoto et al. Statistical Dependency Analysis with Support Vector Machines, 2003, IWPT.

[14] John Langford et al. Search-based structured prediction, 2009, Machine Learning.

[15] Yoshua Bengio et al. Neural Machine Translation by Jointly Learning to Align and Translate, 2014, ICLR.

[16] Hans Uszkoreit et al. AMR Parsing with an Incremental Joint Model, 2016, EMNLP.

[17] Wei-Te Chen. Learning to Map Dependency Parses to Abstract Meaning Representations, 2015, ACL.

[18] Chenhui Chu et al. Supervised Syntax-based Alignment between English Sentences and Abstract Meaning Representation Graphs, 2016, ArXiv.

[19] Jaime G. Carbonell et al. CMU at SemEval-2016 Task 8: Graph-based AMR Parsing with Infinite Ramp Loss, 2016, *SEMEVAL.

[20] Gourab Kundu et al. Neural Cross-Lingual Entity Linking, 2017, AAAI.

[21] James H. Martin et al. Abstract Meaning Representation Parsing using LSTM Recurrent Neural Networks, 2017, ACL.

[22] Giorgio Satta et al. An Incremental Parser for Abstract Meaning Representation, 2016, EACL.

[23] Jian Ni et al. Weakly Supervised Cross-Lingual Named Entity Recognition via Effective Annotation and Representation Projection, 2017, ACL.

[24] Joakim Nivre. An Efficient Algorithm for Projective Dependency Parsing, 2003, IWPT.

[25] Nathan Schneider et al. A Structured Syntax-Semantics Interface for English-AMR Alignment, 2018, NAACL-HLT.

[26] Joakim Nivre. Algorithms for Deterministic Incremental Dependency Parsing, 2008, CL.

[27] Yang Gao et al. Aligning English Strings with Abstract Meaning Representation Graphs, 2014, EMNLP.

[28] Noah A. Smith et al. Transition-Based Dependency Parsing with Stack Long Short-Term Memory, 2015, ACL.

[29] Andreas Vlachos et al. Noise reduction and targeted exploration in imitation learning for Abstract Meaning Representation parsing, 2016, ACL.

[30] Yijia Liu et al. An AMR Aligner Tuned by Transition-based Parser, 2018, EMNLP.

[31] Joakim Nivre et al. A Dynamic Oracle for Arc-Eager Dependency Parsing, 2012, COLING.

[32] Christopher D. Manning et al. Effective Approaches to Attention-based Neural Machine Translation, 2015, EMNLP.

[33] Richard Socher et al. A Deep Reinforced Model for Abstractive Summarization, 2017, ICLR.

[34] Wei Lu et al. Better Transition-Based AMR Parsing with a Refined Search Space, 2018, EMNLP.

[35] Ivan Titov et al. AMR Parsing as Graph Prediction with Latent Alignment, 2018, ACL.

[36] Daniel Marcu et al. Learning as search optimization: approximate large margin methods for structured prediction, 2005, ICML.

[37] Wang Ling et al. Two/Too Simple Adaptations of Word2Vec for Syntax Problems, 2015, NAACL.

[38] Luke S. Zettlemoyer et al. Deep Contextualized Word Representations, 2018, NAACL.

[39] Jaime G. Carbonell et al. A Discriminative Graph-Based Parser for the Abstract Meaning Representation, 2014, ACL.

[40] Noah A. Smith et al. Improved Transition-based Parsing by Modeling Characters instead of Words with LSTMs, 2015, EMNLP.

[41] Jonathan May. SemEval-2016 Task 8: Meaning Representation Parsing, 2016, SemEval@NAACL-HLT.

[42] Chuan Wang et al. Getting the Most out of AMR Parsing, 2017, EMNLP.

[43] Joakim Nivre et al. Training Deterministic Parsers with Non-Deterministic Oracles, 2013, TACL.

[44] Chuan Wang et al. A Transition-based Algorithm for AMR Parsing, 2015, NAACL.

[45] Ming-Wei Chang et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.