DRTS Parsing with Structure-Aware Encoding and Decoding

Discourse representation tree structure (DRTS) parsing is a novel semantic parsing task that has attracted attention only recently. State-of-the-art performance can be achieved by a neural sequence-to-sequence model that treats tree construction as an incremental sequence generation problem. However, structural information such as the input syntax and the intermediate skeleton of the partial output is ignored by this model, even though it could be useful for DRTS parsing. In this work, we propose a structure-aware model that integrates such structural information at both the encoding and decoding phases, exploiting graph attention networks (GAT) for effective modeling. Experimental results on a benchmark dataset show that our proposed model is effective and obtains the best performance reported in the literature.
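To make the GAT component concrete, below is a minimal single-head graph attention layer in the style of Velickovic et al. (2018), which is the kind of building block the abstract refers to for encoding structural information such as syntax graphs. This is an illustrative sketch only, not the authors' DRTS parser: the class name, dimensions, and the toy chain-shaped adjacency matrix are assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Minimal single-head graph attention layer (illustrative sketch)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # shared linear transform
        self.a = nn.Linear(2 * out_dim, 1, bias=False)    # attention scoring vector

    def forward(self, h, adj):
        # h:   (num_nodes, in_dim) node features, e.g. word or skeleton-node states
        # adj: (num_nodes, num_nodes) 0/1 adjacency, e.g. derived from a syntax tree
        z = self.W(h)                                      # (N, out_dim)
        n = z.size(0)
        # Pairwise attention logits e_ij = LeakyReLU(a([z_i || z_j])).
        zi = z.unsqueeze(1).expand(n, n, -1)
        zj = z.unsqueeze(0).expand(n, n, -1)
        e = F.leaky_relu(self.a(torch.cat([zi, zj], dim=-1)).squeeze(-1), 0.2)
        # Mask non-edges so each node attends only to its graph neighbours.
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(e, dim=-1)                   # (N, N) attention weights
        return F.elu(alpha @ z)                            # aggregated node representations

# Toy usage: 4 nodes in a chain, with self-loops so every row has a valid neighbour.
h = torch.randn(4, 16)
adj = torch.eye(4) + torch.diag(torch.ones(3), 1) + torch.diag(torch.ones(3), -1)
out = GATLayer(16, 32)(h, adj)
print(out.shape)  # torch.Size([4, 32])
```

In a structure-aware encoder or decoder, stacking such layers over the syntax graph (or over the partial output skeleton) lets each node representation be updated from its structural neighbours via learned attention weights.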
