Red Dragon AI at TextGraphs 2019 Shared Task: Language Model Assisted Explanation Generation