NLP-IIS@UT at SemEval-2021 Task 4: Machine Reading Comprehension using the Long Document Transformer
Azadeh Shakery | Heshaam Faili | Sajad Movahedi | Ali Ebrahimi | Hossein Basafa
[1] Quoc V. Le,et al. Distributed Representations of Sentences and Documents , 2014, ICML.
[2] Sebastian Riedel,et al. Constructing Datasets for Multi-hop Reading Comprehension Across Documents , 2017, TACL.
[3] Hossein Amirkhani,et al. A Survey on Machine Reading Comprehension Systems , 2020, Natural Language Engineering.
[4] Jian Zhang,et al. SQuAD: 100,000+ Questions for Machine Comprehension of Text , 2016, EMNLP.
[5] Jeffrey Pennington,et al. GloVe: Global Vectors for Word Representation , 2014, EMNLP.
[6] Mark A. Changizi. Economically organized hierarchies in WordNet and the Oxford English Dictionary , 2008, Cognitive Systems Research.
[7] Sebastian Ruder,et al. Universal Language Model Fine-tuning for Text Classification , 2018, ACL.
[8] Luke S. Zettlemoyer,et al. Deep Contextualized Word Representations , 2018, NAACL.
[9] Chenguang Zhu,et al. SDNet: Contextualized Attention-based Deep Network for Conversational Question Answering , 2018, ArXiv.
[10] Zhiyuan Liu,et al. Topical Word Embeddings , 2015, AAAI.
[11] Ming-Wei Chang,et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding , 2019, NAACL.
[12] Quan Liu,et al. PINGAN Omini-Sinitic at SemEval-2021 Task 4: Reading Comprehension of Abstract Meaning , 2021, SEMEVAL.
[13] Agnieszka Mykowiecka,et al. Natural-Language Generation - An Overview , 1991, Int. J. Man Mach. Stud.
[14] Jonathan Pilault,et al. Conditionally Adaptive Multi-Task Learning: Improving Transfer Learning in NLP Using Fewer Parameters & Less Data , 2020, ArXiv.
[15] Jeffrey Dean,et al. Distributed Representations of Words and Phrases and their Compositionality , 2013, NIPS.
[16] Arman Cohan,et al. Longformer: The Long-Document Transformer , 2020, ArXiv.