Document-level Clinical Entity and Relation Extraction via Knowledge Base-Guided Generation
Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 13.07.2024 |
Summary: Generative pre-trained transformer (GPT) models have shown promise in clinical entity and relation extraction tasks because of their precise extraction and contextual understanding capabilities. In this work, we further leverage the Unified Medical Language System (UMLS) knowledge base to accurately identify medical concepts and improve clinical entity and relation extraction at the document level. Our framework selects UMLS concepts relevant to the text and combines them with prompts to guide language models in extracting entities. Our experiments demonstrate that this initial concept mapping, and the inclusion of the mapped concepts in the prompts, improves extraction results compared to few-shot extraction with generic language models that do not leverage UMLS. Further, our results show that this approach is more effective than the standard Retrieval-Augmented Generation (RAG) technique, in which retrieved data is matched against prompt embeddings before generating results. Overall, we find that integrating UMLS concepts with GPT models significantly improves entity and relation identification, outperforming both the baseline and RAG models. By combining the precise concept mapping capability of knowledge-based approaches like UMLS with the contextual understanding capability of GPT, our method highlights the potential of these approaches in specialized domains like healthcare.
DOI: 10.48550/arxiv.2407.10021
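
The summary describes a pipeline that first maps document text to UMLS concepts and then folds those concepts into the extraction prompt. The sketch below is a minimal, self-contained illustration of that idea only; it is not the authors' code. The concept index, the `map_to_umls_concepts` and `build_prompt` names, and the naive substring matching are all illustrative assumptions; a real system would query UMLS (e.g., through a concept linker) and send the resulting prompt to a GPT model.

```python
# Minimal sketch (not the authors' implementation): map text spans to UMLS-style
# concepts via a toy dictionary, then build a concept-augmented extraction prompt.
from typing import Dict, List, Tuple

# Toy stand-in for a UMLS concept index: surface form -> (CUI, preferred name, semantic type).
# Entries are illustrative; a real lookup would cover the full UMLS Metathesaurus.
TOY_UMLS_INDEX: Dict[str, Tuple[str, str, str]] = {
    "hypertension": ("C0020538", "Hypertensive disease", "Disease or Syndrome"),
    "lisinopril": ("C0065374", "Lisinopril", "Pharmacologic Substance"),
}

def map_to_umls_concepts(text: str) -> List[Tuple[str, str, str]]:
    """Return concepts whose surface form appears in the document (naive substring match)."""
    lowered = text.lower()
    return [concept for surface, concept in TOY_UMLS_INDEX.items() if surface in lowered]

def build_prompt(document: str) -> str:
    """Combine mapped concepts with the document to guide entity/relation extraction."""
    concepts = map_to_umls_concepts(document)
    concept_lines = "\n".join(f"- {cui}: {name} ({stype})" for cui, name, stype in concepts)
    return (
        "Relevant UMLS concepts found in the document:\n"
        f"{concept_lines}\n\n"
        "Using these concepts as grounding, extract all clinical entities and the "
        "relations between them from the document below.\n\n"
        f"Document:\n{document}"
    )

if __name__ == "__main__":
    note = "Patient with hypertension, started on lisinopril 10 mg daily."
    print(build_prompt(note))  # this prompt would then be passed to the GPT model
```

Unlike standard RAG, where retrieved passages are selected by embedding similarity to the prompt, this scheme grounds the prompt in explicitly mapped concepts, which the summary reports to be the more effective strategy for this task.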