Question Answering System for Healthcare Information based on BERT and GPT

Recently, demand for telemedicine counseling has been rising. Although telemedicine is legally prohibited in Korea, the number of healthcare-related questions on online question-and-answer platforms is steadily increasing. However, doctors must manually write answers to each question, even when similar questions have already been answered, which is time-consuming and inefficient. This study implements a question answering system that provides healthcare information using an algorithm that combines the BERT and GPT-2 architectures. Q&A pairs collected from an online knowledge-sharing service were used as training data, and the BERT-GPT-2 model was trained to generate answer sentences that provide healthcare information, such as the relevant medical department and suspected diseases, in response to healthcare-related questions. The BERT-GPT-2 model obtained a worse perplexity (PPL) and loss than the baseline model, but a better score in the qualitative assessment. This result suggests that BERT-GPT-2 better understands the intent of the conversation.
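The combined architecture can be sketched with the HuggingFace transformers library, whose EncoderDecoderModel wrapper supports exactly this kind of BERT-encoder/GPT-2-decoder pairing via cross-attention. The sketch below is a minimal illustration under assumed checkpoint names and toy English data; the paper's actual checkpoints, Korean-language training corpus, and hyperparameters are not specified here, and the generated output is only meaningful after fine-tuning, since the cross-attention layers start untrained.

    # Minimal sketch of a BERT-GPT-2 encoder-decoder, using HuggingFace
    # transformers. Checkpoint names, the example question, and the example
    # answer are illustrative assumptions, not the paper's models or data.
    import torch
    from transformers import AutoTokenizer, EncoderDecoderModel

    ENC = "bert-base-multilingual-cased"   # assumed encoder checkpoint
    DEC = "gpt2"                           # assumed decoder checkpoint

    # Loads BERT as the encoder and GPT-2 as the decoder; the decoder is
    # re-configured with is_decoder=True and freshly initialized
    # cross-attention layers over the encoder's hidden states.
    model = EncoderDecoderModel.from_encoder_decoder_pretrained(ENC, DEC)

    enc_tok = AutoTokenizer.from_pretrained(ENC)
    dec_tok = AutoTokenizer.from_pretrained(DEC)
    dec_tok.pad_token = dec_tok.eos_token  # GPT-2 has no pad token by default

    model.config.decoder_start_token_id = dec_tok.bos_token_id
    model.config.pad_token_id = dec_tok.pad_token_id

    question = "I have had a persistent cough and a mild fever for a week."
    answer = "It may be bronchitis; consider visiting internal medicine."

    q = enc_tok(question, return_tensors="pt")
    a = dec_tok(answer, return_tensors="pt").input_ids

    # Fine-tuning step: the question goes to the encoder and the reference
    # answer is the label; the returned loss is token-level cross-entropy,
    # and perplexity (PPL) is exp(loss).
    out = model(input_ids=q.input_ids, attention_mask=q.attention_mask,
                labels=a)
    print("loss:", out.loss.item(), "PPL:", torch.exp(out.loss).item())

    # Inference: generate an answer sentence for the question.
    gen = model.generate(input_ids=q.input_ids,
                         attention_mask=q.attention_mask,
                         max_new_tokens=64, num_beams=4)
    print(dec_tok.decode(gen[0], skip_special_tokens=True))

Note that the PPL reported in the evaluation corresponds to the exponential of the mean token-level cross-entropy loss on held-out data, which is why the PPL and loss results in the quantitative comparison move together.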
