Biomedical event extraction based on GRU integrating attention mechanism

Bibliographic Details
Published in: BMC Bioinformatics, Vol. 19, No. Suppl 9, p. 285
Main Authors: Li, Lishuang; Wan, Jia; Zheng, Jieqiong; Wang, Jian
Format: Journal Article
Language: English
Published: England: BioMed Central Ltd, 13.08.2018

More Information
Summary: Biomedical event extraction is a crucial task in biomedical text mining. As the primary forum for international evaluation of biomedical event extraction technologies, the BioNLP Shared Task represents a trend in biomedical text mining toward fine-grained information extraction (IE). The fourth series of the BioNLP Shared Task in 2016 (BioNLP-ST'16) proposed three tasks, among which the Bacteria Biotope (BB) event extraction task had already been put forward in earlier editions of the BioNLP-ST. Deep learning methods provide an effective way to automatically extract complex features and have achieved notable results in various natural language processing tasks. In this paper, we propose a novel Gated Recurrent Unit (GRU) network framework integrating an attention mechanism to extract biomedical events between bacteria and biotopes from the biomedical literature, using the corpus of the BioNLP-ST'16 Bacteria Biotope task. The experimental results show that the proposed approach achieves an F-score of 57.42% on the test set, outperforming the previous state-of-the-art official submissions to BioNLP-ST 2016 and demonstrating the potential and effectiveness of the framework.
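
As a rough illustration of the kind of architecture the abstract describes, the following is a minimal sketch (not the authors' code) of a bidirectional GRU encoder with an attention layer that pools the hidden states before classifying a candidate bacteria-biotope pair. All layer sizes, class counts, and names here are illustrative assumptions, not details taken from the paper.

    # Minimal sketch, assuming pre-tokenized, index-encoded sentences;
    # dimensions, vocabulary size, and the binary "event vs. non-event"
    # output are hypothetical choices for illustration only.
    import torch
    import torch.nn as nn

    class AttentiveGRUClassifier(nn.Module):
        def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=128, num_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
            self.attn = nn.Linear(2 * hidden_dim, 1)           # scores each time step
            self.out = nn.Linear(2 * hidden_dim, num_classes)  # event vs. non-event logits

        def forward(self, token_ids):                  # token_ids: (batch, seq_len)
            h, _ = self.gru(self.embed(token_ids))     # h: (batch, seq_len, 2*hidden_dim)
            weights = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # (batch, seq_len)
            context = (weights.unsqueeze(-1) * h).sum(dim=1)          # attention-pooled sentence vector
            return self.out(context)                   # (batch, num_classes)

    # Toy usage: one padded sentence of 12 token ids, binary decision.
    model = AttentiveGRUClassifier()
    logits = model(torch.randint(1, 10000, (1, 12)))
    print(logits.shape)  # torch.Size([1, 2])

The attention weights replace simple last-hidden-state or max pooling, letting the classifier focus on tokens most relevant to the candidate entity pair; this is the general idea behind attention-based GRU relation extractors, sketched here under the stated assumptions.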
ISSN: 1471-2105
DOI: 10.1186/s12859-018-2275-2