Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction
Published in | BMC Bioinformatics, Vol. 23, No. 1, Article 120 (20 pages)
Main Authors | Su, Peng; Vijay-Shanker, K.
Format | Journal Article |
Language | English |
Published | London: BioMed Central (BMC), 04.04.2022
Subjects | Deep learning; Text mining; BERT; Transformer; Biomedical relation extraction
ISSN | 1471-2105
DOI | 10.1186/s12859-022-04642-w |
Abstract | Background
Recently, automatically extracting biomedical relations has become a significant subject in biomedical research due to the rapid growth of the biomedical literature. Since their adaptation to the biomedical domain, transformer-based BERT models have produced leading results on many biomedical natural language processing tasks. In this work, we explore approaches to improving the BERT model for relation extraction tasks in both the pre-training and fine-tuning stages of its application. In the pre-training stage, we add another level of BERT adaptation on sub-domain data to bridge the gap between domain knowledge and task-specific knowledge. We also propose methods to incorporate the otherwise ignored knowledge in the last layer of BERT to improve its fine-tuning.
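The abstract does not include the authors' code, so the following is only a minimal sketch of what such an extra adaptation step could look like: continued masked-language-model pre-training of an already domain-adapted BERT checkpoint on a sub-domain corpus, using the Hugging Face transformers and datasets libraries. The checkpoint name, corpus path, and hyperparameters are illustrative placeholders, not the paper's settings.

```python
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Hypothetical starting point: any biomedical BERT checkpoint.
checkpoint = "dmis-lab/biobert-base-cased-v1.1"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Hypothetical sub-domain corpus: one sentence or abstract per line.
corpus = load_dataset("text", data_files={"train": "subdomain_corpus.txt"})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

# Standard BERT-style masking: 15% of tokens are masked for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                           mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-subdomain",
                           num_train_epochs=3,
                           per_device_train_batch_size=16),
    train_dataset=tokenized["train"],
    data_collator=collator)
trainer.train()  # the adapted weights are then fine-tuned on the RE task
```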
Results
The experimental results demonstrate that our approaches to pre-training and fine-tuning can improve the performance of the BERT model. After combining the two proposed techniques, our approach outperforms the original BERT models with an average F1-score improvement of 2.1% on relation extraction tasks. Moreover, our approach achieves state-of-the-art performance on three relation extraction benchmark datasets.
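For context, relation extraction systems are conventionally scored with the F1 score, the harmonic mean of precision and recall over the predicted relations. A toy computation with scikit-learn (the labels below are invented for illustration):

```python
from sklearn.metrics import f1_score

gold = [1, 0, 1, 1, 0, 1]  # 1 = relation present, 0 = no relation
pred = [1, 0, 0, 1, 0, 1]  # one true relation missed by the model
print(f"F1 = {f1_score(gold, pred):.3f}")  # precision 1.0, recall 0.75 -> 0.857
```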
Conclusions
The extra pre-training step on sub-domain data can help the BERT model generalize to specific tasks, and our proposed fine-tuning mechanism can utilize the knowledge in the last layer of BERT to boost model performance. Furthermore, the combination of these two approaches further improves the performance of the BERT model on relation extraction tasks.
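The abstract does not spell out the proposed fine-tuning mechanism, so the sketch below shows only one plausible reading of "utilizing the knowledge in the last layer": a classification head that pools every last-layer token state rather than just the [CLS] vector. The pooling scheme and all names are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class RelationClassifier(nn.Module):
    """Hypothetical RE head combining the [CLS] vector with a masked
    mean over all last-layer token states, so the whole last layer
    contributes to the relation decision."""

    def __init__(self, checkpoint: str, num_labels: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(checkpoint)
        hidden = self.encoder.config.hidden_size
        self.classifier = nn.Linear(2 * hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        last = out.last_hidden_state                  # (batch, seq, hidden)
        cls = last[:, 0]                              # [CLS] representation
        mask = attention_mask.unsqueeze(-1).float()   # ignore padding tokens
        mean = (last * mask).sum(dim=1) / mask.sum(dim=1)
        return self.classifier(torch.cat([cls, mean], dim=-1))
```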
ArticleNumber | 120 |
Author | Su, Peng; Vijay-Shanker, K.
Author details | Peng Su (psu@udel.edu) and K. Vijay-Shanker, Department of Computer and Information Science, Biomedical Text Mining Lab, University of Delaware
Copyright | The Author(s) 2022. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Discipline | Biology |
EISSN | 1471-2105 |
EndPage | 20 |
External IDs | PMC8978438; PMID 35379166
Grant Information | NIGMS NIH HHS, grant U01 GM125267
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Keywords | Deep learning; Text mining; BERT; Transformer; Biomedical relation extraction
License | 2022. The Author(s). Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.1186/s12859-022-04642-w |
PMID | 35379166 |
PageCount | 20 |
PublicationDate | 2022-04-04 |
PublicationPlace | London |
PublicationTitle | BMC Bioinformatics
PublicationYear | 2022 |
Publisher | BioMed Central (BMC)
StartPage | 120 |
SubjectTerms | Adaptation; Algorithms; BERT; Bioinformatics; Biomedical and Life Sciences; Biomedical relation extraction; Biomedical Research - methods; Classification; Computational Biology/Bioinformatics; Computer Appl. in Life Sciences; Data Mining - methods; Datasets; Deep learning; Domains; Drug interactions; Electric Power Supplies; Experiments; Knowledge; Language; Life Sciences; Medical research; Microarrays; Natural language; Natural Language Processing; Performance enhancement; Proteins; Semantics; Text mining; Training; Transformer
URI | https://link.springer.com/article/10.1186/s12859-022-04642-w https://www.ncbi.nlm.nih.gov/pubmed/35379166 https://www.proquest.com/docview/2651949579 https://www.proquest.com/docview/2647212592 https://pubmed.ncbi.nlm.nih.gov/PMC8978438 https://doaj.org/article/cd549500855e4984ad0ba4733ba62c36 |
Volume | 23 |