Contextual embedding bootstrapped neural network for medical information extraction of coronary artery disease records

Bibliographic Details
Published in: Medical & Biological Engineering & Computing, Vol. 59, No. 5, pp. 1111-1121
Main Authors: Cen, Xingxing; Yuan, Junyi; Pan, Changqing; Tang, Qinhua; Ma, Qunsheng
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.05.2021

More Information
Summary: Coronary artery disease (CAD) is the major cause of death worldwide. The development of new early CAD diagnosis methods based on medical big data has great potential to reduce the risk of CAD death. In this process, the neural network (NN), as a powerful tool for electronic medical record (EMR) processing, enables accurate extraction of structured data to unlock medical information and further improve CAD diagnosis. However, the excessive time and labor required for dataset annotation is the main limitation of its application, especially for CAD records, which contain large amounts of natural-language text and specialized biomedical content. In this study, we present an annotation-cost-saving NN approach for CAD records, bootstrapped by a deep language model whose contextual embeddings are pre-trained on a large unannotated CAD corpus. To demonstrate the feasibility and evaluate the performance of our approach, we performed a pre-training experiment and a term classification experiment, using the unannotated and the annotated CAD records, respectively. The results showed that our contextual-embedding-bootstrapped NN for CAD records performs better when the number of annotations is reduced.
Graphical abstract
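The abstract describes a two-stage approach: pre-train contextual embeddings on unannotated CAD text, then bootstrap a term classifier from that encoder so that far fewer annotated records are needed. The sketch below illustrates that general idea in PyTorch only; the encoder size, masked-token pre-training objective, mean pooling, and all names (ContextualEncoder, MLMHead, TermClassifier, VOCAB_SIZE, etc.) are illustrative assumptions and not the authors' published architecture.

```python
# Minimal two-stage sketch (assumed architecture, not the paper's exact model):
# stage 1 pre-trains a contextual encoder on unannotated text via masked-token
# prediction; stage 2 reuses the encoder to bootstrap a term classifier.
import torch
import torch.nn as nn

VOCAB_SIZE, EMB_DIM, MAX_LEN, MASK_ID = 8000, 128, 64, 1  # hypothetical sizes

class ContextualEncoder(nn.Module):
    """Transformer encoder yielding a contextual embedding for every token."""
    def __init__(self):
        super().__init__()
        self.tok = nn.Embedding(VOCAB_SIZE, EMB_DIM)
        self.pos = nn.Embedding(MAX_LEN, EMB_DIM)
        layer = nn.TransformerEncoderLayer(EMB_DIM, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, ids):                                 # ids: (batch, seq)
        pos = torch.arange(ids.size(1), device=ids.device)
        return self.encoder(self.tok(ids) + self.pos(pos))  # (batch, seq, dim)

class MLMHead(nn.Module):
    """Stage 1: masked-token prediction on unannotated records."""
    def __init__(self, encoder):
        super().__init__()
        self.encoder = encoder
        self.out = nn.Linear(EMB_DIM, VOCAB_SIZE)

    def forward(self, ids):
        return self.out(self.encoder(ids))                  # per-token vocab logits

class TermClassifier(nn.Module):
    """Stage 2: term classification bootstrapped from the pre-trained encoder."""
    def __init__(self, encoder, num_classes):
        super().__init__()
        self.encoder = encoder
        self.cls = nn.Linear(EMB_DIM, num_classes)

    def forward(self, ids):
        return self.cls(self.encoder(ids).mean(dim=1))      # mean-pooled sequence vector

if __name__ == "__main__":
    encoder = ContextualEncoder()

    # Stage 1: one pre-training step on a dummy unannotated batch
    # (random token ids stand in for tokenized CAD record text).
    ids = torch.randint(2, VOCAB_SIZE, (4, MAX_LEN))
    mask = torch.rand(ids.shape) < 0.15                     # mask ~15% of tokens
    corrupted = ids.clone()
    corrupted[mask] = MASK_ID
    mlm = MLMHead(encoder)
    logits = mlm(corrupted).view(-1, VOCAB_SIZE)
    mlm_loss = nn.functional.cross_entropy(logits[mask.view(-1)],
                                           ids.view(-1)[mask.view(-1)])
    mlm_loss.backward()

    # Stage 2: one fine-tuning step on a (much smaller) annotated batch of term labels.
    clf = TermClassifier(encoder, num_classes=5)
    labels = torch.randint(0, 5, (4,))
    clf_loss = nn.functional.cross_entropy(clf(ids), labels)
    clf_loss.backward()
    print(float(mlm_loss), float(clf_loss))
```

Because the encoder's weights are already shaped by the unannotated corpus in stage 1, the stage-2 classifier starts from informative contextual embeddings, which is what lets the labeled set stay small.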
ISSN: 0140-0118, 1741-0444
DOI: 10.1007/s11517-021-02359-1