Infobox-to-text Generation with Tree-like Planning based Attention Network

We study the problem of infobox-to-text generation, which aims to generate a textual description from a key-value table. Representing the input infobox as a sequence, previous neural methods using end-to-end models without order planning suffer from incoherence and poor adaptability to disordered input. Although recent planning-based models have brought some improvements, they depend on a static order-plan to guide generation, which may cause error propagation between planning and generation. To address these issues, we propose a Tree-like PLanning based Attention Network (Tree-PLAN) that leverages both order planning and dynamic tuning to facilitate infobox-to-text generation. We first apply a pointer network to obtain a preliminary order-plan of the input. A novel tree-like tuning encoder is then designed to dynamically tune the order-plan by merging the most relevant attributes together layer by layer. Experiments conducted on two datasets show that our model not only outperforms previous methods on both automatic and human evaluation, but also adapts better to disordered input.
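The following is a minimal, non-neural sketch of the two-stage idea described above: a pointer-style pass that produces a preliminary ordering of the infobox attributes, followed by a tree-like step that repeatedly merges the most relevant pair of attributes layer by layer. The greedy dot-product scoring, the cosine-similarity merge criterion, and the averaging merge rule are illustrative assumptions, not the paper's actual learned modules.

```python
# Illustrative sketch only: Tree-PLAN's real components are learned neural modules
# (a pointer network and an attention-based tree-like tuning encoder); the scoring
# and merge rules below are simplified assumptions for exposition.
import numpy as np

def preliminary_order(attr_vecs, query_vec):
    """Pointer-style step (simplified): greedily pick the attribute whose embedding
    scores highest against the current query, yielding a preliminary order-plan."""
    remaining = list(range(len(attr_vecs)))
    order, q = [], query_vec
    while remaining:
        scores = [attr_vecs[i] @ q for i in remaining]
        pick = remaining[int(np.argmax(scores))]
        order.append(pick)
        remaining.remove(pick)
        q = attr_vecs[pick]  # condition the next pointer step on the last pick
    return order

def tree_like_tuning(attr_vecs, order):
    """Tree-like tuning (assumed merge rule): repeatedly merge the adjacent pair of
    attributes with the highest cosine similarity, layer by layer, until one node remains."""
    nodes = [attr_vecs[i] for i in order]
    while len(nodes) > 1:
        sims = [
            nodes[i] @ nodes[i + 1]
            / (np.linalg.norm(nodes[i]) * np.linalg.norm(nodes[i + 1]) + 1e-8)
            for i in range(len(nodes) - 1)
        ]
        i = int(np.argmax(sims))                  # most relevant adjacent pair
        merged = (nodes[i] + nodes[i + 1]) / 2.0  # placeholder for a learned merge
        nodes = nodes[:i] + [merged] + nodes[i + 2:]
    return nodes[0]

# Toy usage: four attribute embeddings of an infobox and a random query vector.
rng = np.random.default_rng(0)
attrs = rng.normal(size=(4, 8))
plan = preliminary_order(attrs, rng.normal(size=8))
root = tree_like_tuning(attrs, plan)
print(plan, root.shape)
```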
