Lidong Bing | Yan Zhang | Shay B. Cohen | Wei Lu | Zuozhu Liu | Zhiyang Teng | Zhijiang Guo
[1] Richard S. Zemel, et al. Gated Graph Sequence Neural Networks, 2015, ICLR.
[2] Jonathan Berant, et al. Global Reasoning over Database Structures for Text-to-SQL Parsing, 2019, EMNLP.
[3] Wei Lu, et al. Attention Guided Graph Convolutional Networks for Relation Extraction, 2019, ACL.
[4] Doina Precup, et al. Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks, 2019, NeurIPS.
[5] Alon Lavie, et al. Meteor Universal: Language Specific Translation Evaluation for Any Target Language, 2014, WMT@ACL.
[6] Xiaojun Wan, et al. AMR-To-Text Generation with Graph Transformer, 2020, TACL.
[7] Vladlen Koltun, et al. Deep Equilibrium Models, 2019, NeurIPS.
[8] Yue Zhang, et al. AMR-to-text Generation with Synchronous Node Replacement Grammar, 2017, ACL.
[9] Yann Dauphin, et al. Pay Less Attention with Lightweight and Dynamic Convolutions, 2019, ICLR.
[10] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[11] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[12] Yann Dauphin, et al. Language Modeling with Gated Convolutional Networks, 2016, ICML.
[13] Kilian Q. Weinberger, et al. Densely Connected Convolutional Networks, 2017, CVPR.
[14] Kristina Lerman, et al. MixHop: Higher-Order Graph Convolutional Architectures via Sparsified Neighborhood Mixing, 2019, ICML.
[15] Salim Roukos, et al. Bleu: a Method for Automatic Evaluation of Machine Translation, 2002, ACL.
[16] Maja Popovic, et al. chrF++: words helping character n-grams, 2017, WMT.
[17] Zhuowen Tu, et al. Aggregated Residual Transformations for Deep Neural Networks, 2017, CVPR.
[18] Jaime G. Carbonell, et al. Generation from Abstract Meaning Representation using Tree Transducers, 2016, NAACL.
[19] Xiang Ren, et al. KagNet: Knowledge-Aware Graph Networks for Commonsense Reasoning, 2019, EMNLP.
[20] Kevin Knight, et al. Generating English from Abstract Meaning Representations, 2016, INLG.
[21] Jian Yang, et al. Selective Kernel Networks, 2019, CVPR.
[22] Joonseok Lee, et al. N-GCN: Multi-scale Graph Convolution for Semi-supervised Node Classification, 2018, UAI.
[23] Bo Chen, et al. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications, 2017, ArXiv.
[24] Shay B. Cohen, et al. Structural Neural Encoders for AMR-to-text Generation, 2019, NAACL.
[25] Jonathan Berant, et al. Representing Schema Structure with Graph Neural Networks for Text-to-SQL Parsing, 2019, ACL.
[26] Rico Sennrich, et al. Neural Machine Translation of Rare Words with Subword Units, 2015, ACL.
[27] Bernard Ghanem, et al. Can GCNs Go as Deep as CNNs?, 2019, ArXiv.
[28] Philipp Koehn, et al. Statistical Significance Tests for Machine Translation Evaluation, 2004, EMNLP.
[29] Iryna Gurevych, et al. Enhancing AMR-to-Text Generation with Dual Graph Representations, 2019, EMNLP.
[30] Ken-ichi Kawarabayashi, et al. Representation Learning on Graphs with Jumping Knowledge Networks, 2018, ICML.
[31] Deng Cai, et al. Graph Transformer for Graph-to-Sequence Learning, 2019, AAAI.
[32] Guodong Zhou, et al. Modeling Graph Structure in Transformer for Better AMR-to-Text Generation, 2019, EMNLP.
[33] Wei Lu, et al. Learning Latent Forests for Medical Relation Extraction, 2020, IJCAI.
[34] Philipp Koehn, et al. Abstract Meaning Representation for Sembanking, 2013, LAW@ACL.
[35] Mark Steedman, et al. The Role of Reentrancies in Abstract Meaning Representation Parsing, 2020, EMNLP.
[36] Stephen Clark, et al. Factorising AMR generation through syntax, 2019, NAACL-HLT.
[37] Xiangyu Zhang, et al. ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices, 2018, CVPR.
[38] Lukasz Kaiser, et al. Universal Transformers, 2018, ICLR.
[39] Martin Grohe, et al. Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks, 2018, AAAI.
[40] Yejin Choi, et al. Neural AMR: Sequence-to-Sequence Models for Parsing and Generation, 2017, ACL.
[41] Vladlen Koltun, et al. Trellis Networks for Sequence Modeling, 2018, ICLR.
[42] Nicola De Cao, et al. Question Answering by Reasoning Across Documents with Graph Convolutional Networks, 2018, NAACL.
[43] Gholamreza Haffari, et al. Graph-to-Sequence Learning using Gated Graph Neural Networks, 2018, ACL.
[44] Matt Post, et al. Sockeye: A Toolkit for Neural Machine Translation, 2018, ArXiv.
[45] Wei Lu, et al. Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning, 2019, TACL.
[46] Kevin Gimpel, et al. ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, 2019, ICLR.
[47] Zheng Zhang, et al. MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems, 2015, ArXiv.
[48] Yue Zhang, et al. A Graph-to-Sequence Model for AMR-to-Text Generation, 2018, ACL.