Infobox-to-text Generation with Tree-like Planning based Attention Network
Ning Ding, Ying Shen, Hai-Tao Zheng, Yang Bai, Ziran Li
[1] Pascal Poupart, et al. Order-Planning Neural Text Generation From Structured Data, 2017, AAAI.
[2] Xiaocheng Feng, et al. Table-to-Text Generation with Effective Hierarchical Encoder on Three Dimensions (Row, Column and Time), 2019, EMNLP.
[3] Zhifang Sui, et al. Towards Comprehensive Description Generation from Factual Attribute-value Tables, 2019, ACL.
[4] Zhifang Sui, et al. Hierarchical Encoder with Auxiliary Supervision for Neural Table-to-Text Generation: Learning Better Representation for Tables, 2019, AAAI.
[5] Xiaojun Wan, et al. Hierarchical Attention Networks for Sentence Ordering, 2019, AAAI.
[6] Rui Zhang, et al. Sentence Generation for Entity Description with Content-Plan Attention, 2020, AAAI.
[7] Verena Rieser, et al. The E2E Dataset: New Challenges For End-to-End Generation, 2017, SIGDIAL Conference.
[8] Zhifang Sui, et al. Table-to-text Generation by Structure-aware Seq2seq Learning, 2017, AAAI.
[9] Minlie Huang, et al. Long and Diverse Text Generation with Planning-based Hierarchical Variational Model, 2019, EMNLP.
[10] Rong Pan, et al. Operation-guided Neural Networks for High Fidelity Data-To-Text Generation, 2018, EMNLP.
[11] Hung-Yi Lee, et al. Tree Transformer: Integrating Tree Structures into Self-Attention, 2019, EMNLP/IJCNLP.
[12] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[13] Alexander M. Rush, et al. Challenges in Data-to-Document Generation, 2017, EMNLP.
[14] Matthew R. Walter, et al. What to talk about and how? Selective Generation using LSTMs with Coarse-to-Fine Alignment, 2015, NAACL.