Spanish Pre-Trained Language Models for HealthCare Industry
[1] Jimmy Ba, et al. Adam: A Method for Stochastic Optimization, 2014, ICLR.
[2] Jörg Tiedemann, et al. Parallel Data, Tools and Interfaces in OPUS, 2012, LREC.
[3] Iryna Gurevych, et al. AdapterDrop: On the Efficiency of Adapters in Transformers, 2020, EMNLP.
[4] Yiming Yang, et al. XLNet: Generalized Autoregressive Pretraining for Language Understanding, 2019, NeurIPS.
[5] Ayush Kaushal, et al. IITKGP at W-NUT 2020 Shared Task-1: Domain specific BERT representation for Named Entity Recognition of lab protocol, 2020, W-NUT@EMNLP.
[6] Natalia Gimelshein, et al. PyTorch: An Imperative Style, High-Performance Deep Learning Library, 2019, NeurIPS.
[7] Andrew McCallum, et al. Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data, 2001, ICML.
[8] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[9] Johan Bos, et al. Proceedings of the Demonstrations at the 13th Conference of the European Chapter of the Association for Computational Linguistics, 2012.
[10] Thomas Wolf, et al. HuggingFace's Transformers: State-of-the-art Natural Language Processing, 2019, arXiv.
[11] Jürgen Schmidhuber, et al. Long Short-Term Memory, 1997, Neural Computation.
[12] Martin Krallinger, et al. NLP applied to occupational health: MEDDOPROF shared task at IberLEF 2021 on automatic recognition, classification and normalization of professions and occupations from medical texts, 2021, Procesamiento del Lenguaje Natural.
[13] Sampo Pyysalo, et al. brat: a Web-based Tool for NLP-Assisted Text Annotation, 2012, EACL.