Transformer-Based Question Answering Model for the Biomedical Domain
Published in | 2023 5th International Conference on Pattern Analysis and Intelligent Systems (PAIS), pp. 1-6 |
---|---|
Main Authors | |
Format | Conference Proceeding |
Language | English |
Published | IEEE, 25.10.2023 |
DOI | 10.1109/PAIS60821.2023.10322055 |
Summary:

Motivation: Question Answering (QA) is a highly active topic in the field of Natural Language Processing (NLP). Recent progress in neural network models and the availability of large datasets such as SQuAD have played a significant role in improving performance in open domains. However, these systems still need to be adapted effectively to more specialized domains, especially the biomedical field, to help medical practitioners provide accurate answers to inquiries about medicine and healthcare, including specific subjects such as the COVID-19 disease. Fortunately, recent models, such as transformers, have opened up new avenues and techniques for developing accurate systems.

Aims: In this work, we aim to leverage transformer models and transfer learning to train models effectively in the biomedical domain. By taking a model pre-trained for Question Answering tasks and further fine-tuning it on a specific domain, we enhance the system's performance in the biomedical domain. Our ultimate goal is to develop a QA model specifically tailored for COVID-19 QA.

Results: We trained BERT and RoBERTa models on the COVID-QA dataset and achieved competitive results on COVID-19 QA. Our RoBERTa model achieved an Exact Match (EM) score of 0.38 and an F1 score of 0.64 on COVID-QA, indicating successful performance on COVID-19 QA.
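The abstract does not give implementation details, so the following is only a minimal sketch of the transfer-learning setup it describes: start from a RoBERTa model already fine-tuned for extractive QA on SQuAD, then continue fine-tuning on COVID-QA. It assumes the Hugging Face Transformers library, the `deepset/roberta-base-squad2` checkpoint, and the `covid_qa_deepset` dataset; the checkpoint choice and all hyperparameters are illustrative assumptions, not the authors' configuration.

```python
# Sketch: continue fine-tuning a SQuAD-tuned RoBERTa QA model on COVID-QA.
# Checkpoint, dataset ID, and hyperparameters are assumptions, not the
# paper's exact setup.
from datasets import load_dataset
from transformers import (AutoModelForQuestionAnswering, AutoTokenizer,
                          Trainer, TrainingArguments)

checkpoint = "deepset/roberta-base-squad2"  # assumed SQuAD-tuned starting point
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

# COVID-QA: ~2,000 question/answer pairs over scientific articles (SQuAD format).
covid_qa = load_dataset("covid_qa_deepset", split="train")

def preprocess(examples):
    # Tokenize question/context pairs; long contexts are split into
    # overlapping windows so answer spans are not lost to truncation.
    inputs = tokenizer(
        examples["question"],
        examples["context"],
        max_length=384,
        stride=128,
        truncation="only_second",
        return_overflowing_tokens=True,
        return_offsets_mapping=True,
        padding="max_length",
    )
    offsets = inputs.pop("offset_mapping")
    sample_map = inputs.pop("overflow_to_sample_mapping")
    starts, ends = [], []
    for i, offset in enumerate(offsets):
        answer = examples["answers"][sample_map[i]]
        start_char = answer["answer_start"][0]
        end_char = start_char + len(answer["text"][0])
        seq_ids = inputs.sequence_ids(i)
        # Locate the context portion of this window.
        ctx_start = seq_ids.index(1)
        ctx_end = len(seq_ids) - 1 - seq_ids[::-1].index(1)
        # Label (0, 0) when the answer falls outside this window.
        if offset[ctx_start][0] > start_char or offset[ctx_end][1] < end_char:
            starts.append(0)
            ends.append(0)
        else:
            idx = ctx_start
            while idx <= ctx_end and offset[idx][0] <= start_char:
                idx += 1
            starts.append(idx - 1)
            idx = ctx_end
            while idx >= ctx_start and offset[idx][1] >= end_char:
                idx -= 1
            ends.append(idx + 1)
    inputs["start_positions"] = starts
    inputs["end_positions"] = ends
    return inputs

train_set = covid_qa.map(preprocess, batched=True,
                         remove_columns=covid_qa.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="roberta-covid-qa",
        learning_rate=3e-5,        # assumed; the abstract gives no values
        num_train_epochs=2,
        per_device_train_batch_size=8,
    ),
    train_dataset=train_set,
)
trainer.train()
```

The sketch omits evaluation and answer-span post-processing for brevity; the paper reports 0.38 EM / 0.64 F1 on COVID-QA for its RoBERTa model.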