Learning to Order Graph Elements with Application to Multilingual Surface Realization

Recent advances in deep learning have shown promise in solving complex combinatorial optimization problems, such as sorting variable-sized sequences. In this work, we take a step further and tackle the problem of ordering the elements of sequences that come with graph structures. Our solution adopts an encoder-decoder framework, in which the encoder is a graph neural network that learns a representation for each element, and the decoder predicts the ordering of each local neighborhood of the graph in turn. We apply our framework to multilingual surface realization, the task of ordering and completing sentences whose dependency parses are given but whose word order is not. Experiments show that our approach substantially outperforms prior work that does not consider graph structures. We participated in the 2019 Surface Realization Shared Task (SR'19), ranking second out of 14 teams and outperforming the lower-ranked teams by a large margin.
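
To make the described pipeline concrete, below is a minimal sketch of the two components the abstract names: a graph-attention encoder layer over the dependency graph, and a pointer-style decoder that greedily orders one head's local neighborhood. It is an illustrative PyTorch mock-up under our own naming (GraphEncoderLayer, NeighborhoodOrderer), not the authors' released implementation.

```python
# Sketch only: a single-layer graph-attention encoder plus a greedy
# pointer-style decoder for ordering a local neighborhood. Hyperparameters,
# class names, and the toy graph are illustrative assumptions.
import torch
import torch.nn as nn

class GraphEncoderLayer(nn.Module):
    """One attention layer over the graph: each node attends to its neighbors."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, h, adj):
        # h: (n, dim) node states; adj: (n, n) 0/1 adjacency (with self-loops)
        scores = self.q(h) @ self.k(h).t() / h.size(-1) ** 0.5   # (n, n)
        scores = scores.masked_fill(adj == 0, float('-inf'))     # restrict to edges
        attn = torch.softmax(scores, dim=-1)
        return h + attn @ self.v(h)                              # residual update

class NeighborhoodOrderer(nn.Module):
    """Pointer-style decoder: greedily emits one neighbor at a time."""
    def __init__(self, dim):
        super().__init__()
        self.rnn = nn.GRUCell(dim, dim)
        self.ptr = nn.Linear(dim, dim)

    def forward(self, head_vec, neighbor_vecs):
        # head_vec: (dim,) head node; neighbor_vecs: (k, dim) its children
        order, state = [], head_vec.unsqueeze(0)                 # state: (1, dim)
        remaining = list(range(neighbor_vecs.size(0)))
        while remaining:
            # score the not-yet-placed neighbors against the decoder state
            logits = self.ptr(state) @ neighbor_vecs[remaining].t()
            pick = remaining[int(logits.argmax(dim=-1))]
            order.append(pick)
            state = self.rnn(neighbor_vecs[pick].unsqueeze(0), state)
            remaining.remove(pick)
        return order  # predicted left-to-right order of the neighbors

# Toy usage: order the 3 children of node 0 in a 4-node star graph.
torch.manual_seed(0)
dim = 16
h = torch.randn(4, dim)
adj = torch.tensor([[1, 1, 1, 1],
                    [1, 1, 0, 0],
                    [1, 0, 1, 0],
                    [1, 0, 0, 1]])
h = GraphEncoderLayer(dim)(h, adj)
print(NeighborhoodOrderer(dim)(h[0], h[1:]))
```

Decoding neighborhood-by-neighborhood keeps each pointer step over a small candidate set rather than the whole sentence, which is what lets the graph structure constrain the search.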

[1] Samy Bengio et al. Neural Combinatorial Optimization with Reinforcement Learning, 2016, ICLR.

[2] Mirella Lapata et al. Single Document Summarization as Tree Induction, 2019, NAACL.

[3] Leo Wanner et al. Underspecified Universal Dependency Structures as Inputs for Multilingual Surface Realisation, 2018, INLG.

[4] Ming-Wei Chang et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.

[5] Martin Potthast et al. CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies, 2018, CoNLL.

[6] Jure Leskovec et al. GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models, 2018, ICML.

[7] Joakim Nivre et al. Universal Stanford dependencies: A cross-linguistic typology, 2014, LREC.

[8] Anja Belz et al. The Second Multilingual Surface Realisation Shared Task (SR'19): Overview and Evaluation Results, 2019, MSR@EMNLP-IJCNLP.

[9] Samy Bengio et al. Order Matters: Sequence to sequence for sets, 2015, ICLR.

[10] Alan W. Black et al. Top-Down Structurally-Constrained Neural Response Generation with Lexicalized Probabilistic Context-Free Grammar, 2019, NAACL-HLT.

[11] Mikhail Belkin et al. Laplacian Eigenmaps and Spectral Techniques for Embedding and Clustering, 2001, NIPS.

[12] Lukasz Kaiser et al. Attention is All you Need, 2017, NIPS.

[13] Graham Neubig et al. A Tree-based Decoder for Neural Machine Translation, 2018, EMNLP.

[14] Navdeep Jaitly et al. Pointer Networks, 2015, NIPS.

[15] Gholamreza Haffari et al. Graph-to-Sequence Learning using Gated Graph Neural Networks, 2018, ACL.

[16] Ping Li et al. Graph to Graph: a Topology Aware Approach for Graph Structures Learning and Generation, 2019, AISTATS.

[17] Quoc V. Le et al. Sequence to Sequence Learning with Neural Networks, 2014, NIPS.

[18] Jason Weston et al. End-To-End Memory Networks, 2015, NIPS.

[19] Pietro Liò et al. Graph Attention Networks, 2017, ICLR.

[20] Yoram Singer et al. Learning to Order Things, 1997, NIPS.

[21] Yoav Goldberg et al. Towards String-To-Tree Neural Machine Translation, 2017, ACL.