COVID-19 Fake News Detection Using Bidirectional Encoder Representations from Transformers Based Models

The rise of social media allows people to access the latest news easily. During the COVID-19 pandemic, timely access to news is important so that people can take appropriate protective measures. However, fake news has flooded these platforms and has become a serious issue, especially during a global pandemic. Misleading fake news can cause significant harm to individuals and to society, making COVID-19 fake news detection a novel and important task in NLP. The task is further complicated by the fact that fake news often mixes correct and incorrect statements. In this paper, we fine-tune the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model as our base model and add BiLSTM layers and CNN layers on top of the fine-tuned BERT model, with the BERT parameters either frozen or left trainable. Our evaluation shows that the best model (the fine-tuned BERT model with frozen parameters plus BiLSTM layers) achieves state-of-the-art results on the COVID-19 fake news detection task. We also explore keyword evaluation methods using our best model and assess model performance after removing the identified keywords.
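
Below is a minimal sketch, not the authors' released implementation, of the best-performing configuration described in the abstract: a pre-trained BERT encoder with frozen parameters feeding a trainable BiLSTM head for binary real/fake classification. The model name ("bert-base-uncased"), hidden sizes, pooling of the BiLSTM states, and the linear classifier are illustrative assumptions.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class FrozenBertBiLSTM(nn.Module):
    """Frozen BERT encoder + trainable BiLSTM classification head (sketch)."""

    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=256, num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        # Freeze all BERT parameters so only the BiLSTM head is trained.
        for param in self.bert.parameters():
            param.requires_grad = False
        self.bilstm = nn.LSTM(
            input_size=self.bert.config.hidden_size,  # 768 for bert-base
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # Token-level contextual embeddings from the frozen BERT encoder.
        with torch.no_grad():
            outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # Run the BiLSTM over the token sequence and pool the final
        # forward and backward hidden states.
        _, (h_n, _) = self.bilstm(outputs.last_hidden_state)
        pooled = torch.cat((h_n[-2], h_n[-1]), dim=-1)
        return self.classifier(pooled)


# Usage example (hypothetical input): tokenize a claim and score it.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(
    ["Drinking hot water cures COVID-19."],
    padding=True, truncation=True, max_length=128, return_tensors="pt",
)
model = FrozenBertBiLSTM()
logits = model(batch["input_ids"], batch["attention_mask"])  # shape: (1, 2)

The "not frozen" variant mentioned in the abstract would simply skip the requires_grad loop (and the no_grad context) so that BERT is fine-tuned jointly with the added layers; the CNN variant would replace the BiLSTM with convolutional layers over the token embeddings.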
