NCUEE at MEDIQA 2019: Medical Text Inference Using Ensemble BERT-BiLSTM-Attention Model
Lung-Hao Lee | Po-Lei Lee | Kuo-Kai Shyu | Po-Han Chen | Yi Lu