Auto-tagging of Short Conversational Sentences using Transformer Methods
D. Emre Tasar | Şükrü Ozan | Umut Özdil | M. Fatih Akca | Oğuzhan Ölmez | Semih Gülüm | Seçilay Kutal | Ceren Belhan
[1] David M. W. Powers, et al. Evaluation: from precision, recall and F-measure to ROC, informedness, markedness and correlation, 2011, ArXiv.
[2] Xuanjing Huang, et al. How to Fine-Tune BERT for Text Classification?, 2019, CCL.
[3] Akin Özçift, et al. Advancing natural language processing (NLP) applications of morphologically rich languages with bidirectional encoder representations from transformers (BERT): an empirical case study for Turkish, 2021.
[4] Dewayne Whitfield. Using GPT-2 to Create Synthetic Data to Improve the Prediction Performance of NLP Machine Learning Classification Models, 2021, ArXiv.
[5] Rui Cao, et al. Document Classification by Word Embeddings of BERT, 2019, PACLING.
[6] Ming-Wei Chang, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, 2019, NAACL.
[7] Camille Pradel, et al. Load What You Need: Smaller Versions of Multilingual BERT, 2020, SUSTAINLP.
[8] Tapio Salakoski, et al. Is Multilingual BERT Fluent in Language Generation?, 2019, ArXiv.
[9] Thomas Wolf, et al. DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter, 2019, ArXiv.
[10] Shashi Pal Singh, et al. Building a Machine Learning Model for Unstructured Text Classification: Towards Hybrid Approach, 2021.
[11] Ilya Sutskever, et al. Language Models are Unsupervised Multitask Learners, 2019.