Yunhai Tong | Jing Yu | Yaming Yang | Jiangang Bai | Yujing Wang | Yiren Chen | Jing Bai
[1] Jeffrey Pennington, et al. GloVe: Global Vectors for Word Representation, 2014, EMNLP.
[2] Ilya Sutskever, et al. Generating Long Sequences with Sparse Transformers, 2019, ArXiv.
[3] Lukasz Kaiser, et al. Attention is All you Need, 2017, NIPS.
[4] Omer Levy, et al. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding, 2018, BlackboxNLP@EMNLP.
[5] Christopher Potts, et al. A large annotated corpus for learning natural language inference, 2015, EMNLP.
[6] Danqi Chen, et al. A Fast and Accurate Dependency Parser using Neural Networks, 2014, EMNLP.
[7] Aaron C. Courville, et al. Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks, 2018, ICLR.
[8] Peng Qi, et al. Do Syntax Trees Help Pre-trained Transformers Extract Information?, 2020, ArXiv.
[9] Zhi Jin, et al. Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths, 2015, EMNLP.
[10] Luo Si, et al. Supervised Treebank Conversion: Data and Approaches, 2018, ACL.
[11] Christopher D. Manning, et al. Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks, 2015, ACL.
[12] Benoît Sagot, et al. What Does BERT Learn about the Structure of Language?, 2019, ACL.
[13] Aaron C. Courville, et al. Neural Language Modeling by Jointly Learning Syntax and Lexicon, 2017, ICLR.
[14] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[15] An Anatomy of Graph Neural Networks Going Deep via the Lens of Mutual Information: Exponential Decay vs. Full Preservation, 2019, ArXiv.
[16] Eliyahu Kiperwasser, et al. Scheduled Multi-Task Learning: From Syntax to Translation, 2018, TACL.
[17] Colin Raffel, et al. Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer, 2019, J. Mach. Learn. Res.
[18] Christopher D. Manning, et al. A Structural Probe for Finding Syntax in Word Representations, 2019, NAACL.
[19] Dan Klein, et al. Accurate Unlexicalized Parsing, 2003, ACL.
[20] Christopher Potts, et al. Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank, 2013, EMNLP.
[21] Christopher D. Manning, et al. Generating Typed Dependency Parses from Phrase Structure Parses, 2006, LREC.
[22] Yunhai Tong, et al. A Position Encoding Convolutional Neural Network Based on Dependency Tree for Relation Classification, 2016, EMNLP.
[23] Guoyin Wang, et al. Syntax-Infused Transformer and BERT models for Machine Translation and Natural Language Understanding, 2019, ArXiv.
[24] Samuel R. Bowman, et al. A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference, 2017, NAACL.
[25] Shafiq R. Joty, et al. Tree-structured Attention with Hierarchical Accumulation, 2020, ICLR.
[26] Omer Levy, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach, 2019, ArXiv.