Joint relational triple extraction based on potential relation detection and conditional entity mapping
Published in | Applied Intelligence (Dordrecht, Netherlands), Vol. 53, No. 24, pp. 29656-29676 |
Main Authors | Zhou, Xiong; Zhang, Qinghua; Gao, Man; Wang, Guoyin |
Format | Journal Article |
Language | English |
Published | New York: Springer US, 01.12.2023 (Springer Nature B.V) |
Subjects | |
Abstract | Joint relational triple extraction treats entity recognition and relation extraction as a joint task to extract relational triples, and this is a critical task in information extraction and knowledge graph construction. However, most existing joint models still fall short in terms of extracting overlapping triples. Moreover, these models ignore the trigger words of potential relations during the relation detection process. To address the two issues, a joint model based on Potential Relation Detection and Conditional Entity Mapping is proposed, named PRDCEM. Specifically, the proposed model consists of three components, i.e., potential relation detection, candidate entity tagging, and conditional entity mapping, corresponding to three subtasks. First, a non-autoregressive decoder that contains a cross-attention mechanism is applied to detect potential relations. In this way, different potential relations are associated with the corresponding trigger words in the given sentence, and the semantic representations of the trigger words are fully utilized to encode potential relations. Second, two distinct sequence taggers are employed to extract candidate subjects and objects. Third, an entity mapping module incorporating conditional layer normalization is designed to align the candidate subjects and objects. As such, each candidate subject and each potential relation are combined to form a condition that is incorporated into the sentence, which can effectively extract overlapping triples. Finally, the negative sampling strategy is employed in the entity mapping module to mitigate the error propagation from the previous two components. In a comparison with 15 baselines, the experimental results obtained on two widely used public datasets demonstrate that PRDCEM can effectively extract overlapping triples and achieve improved performance. |
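The conditional entity-mapping step described in the abstract can be illustrated with a short sketch. The following PyTorch code is a minimal, assumed implementation of conditional layer normalization driven by a (candidate subject, potential relation) condition, followed by pointwise object taggers; it is not the authors' released code, and every class, method, and dimension name (ConditionalLayerNorm, ObjectTaggerUnderCondition, hidden_size, rel_emb_size) is an illustrative assumption.

```python
import torch
import torch.nn as nn


class ConditionalLayerNorm(nn.Module):
    """Layer normalization whose gain and bias are shifted by a condition vector.

    Here the condition is assumed to be a candidate-subject representation
    concatenated with a potential-relation embedding; names and sizes are
    illustrative, not taken from the paper's implementation."""

    def __init__(self, hidden_size: int, cond_size: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # Base affine parameters, perturbed per example by the condition.
        self.weight = nn.Parameter(torch.ones(hidden_size))
        self.bias = nn.Parameter(torch.zeros(hidden_size))
        # Project the condition into per-dimension gain and bias offsets.
        self.to_gain = nn.Linear(cond_size, hidden_size, bias=False)
        self.to_bias = nn.Linear(cond_size, hidden_size, bias=False)

    def forward(self, hidden: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size); cond: (batch, cond_size)
        mean = hidden.mean(dim=-1, keepdim=True)
        var = hidden.var(dim=-1, unbiased=False, keepdim=True)
        normed = (hidden - mean) / torch.sqrt(var + self.eps)
        gain = self.weight + self.to_gain(cond).unsqueeze(1)
        bias = self.bias + self.to_bias(cond).unsqueeze(1)
        return gain * normed + bias


class ObjectTaggerUnderCondition(nn.Module):
    """Sketch of entity mapping: inject a (subject, relation) condition into the
    sentence encoding via conditional layer normalization, then score object
    start/end positions with pointwise classifiers."""

    def __init__(self, hidden_size: int, rel_emb_size: int):
        super().__init__()
        self.cln = ConditionalLayerNorm(hidden_size, hidden_size + rel_emb_size)
        self.start_head = nn.Linear(hidden_size, 1)
        self.end_head = nn.Linear(hidden_size, 1)

    def forward(self, token_states, subject_repr, relation_emb):
        # token_states: (batch, seq_len, hidden); subject_repr: (batch, hidden)
        # relation_emb: (batch, rel_emb_size)
        cond = torch.cat([subject_repr, relation_emb], dim=-1)
        conditioned = self.cln(token_states, cond)
        start_logits = self.start_head(conditioned).squeeze(-1)
        end_logits = self.end_head(conditioned).squeeze(-1)
        return start_logits, end_logits
```

In use, such a module would be invoked once per (candidate subject, potential relation) pair produced by the first two components; pairs that should yield no object can be trained against all-zero start/end labels, which is one plausible way to realise the negative-sampling strategy mentioned in the abstract.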
Author | Zhang, Qinghua; Wang, Guoyin; Gao, Man; Zhou, Xiong |
Author details |
– Zhou, Xiong: School of Computer Science and Technology, Chongqing University of Posts and Telecommunications; Key Laboratory of Big Data Intelligent Computing, Chongqing University of Posts and Telecommunications; Chongqing Key Laboratory of Computational Intelligence, Chongqing University of Posts and Telecommunications; Key Laboratory of Tourism Multisource Data Perception and Decision, Ministry of Culture and Tourism
– Zhang, Qinghua (email: zhangqh@cqupt.edu.cn; ORCID 0000-0002-6154-4656): Key Laboratory of Big Data Intelligent Computing, Chongqing University of Posts and Telecommunications; Chongqing Key Laboratory of Computational Intelligence, Chongqing University of Posts and Telecommunications; Key Laboratory of Tourism Multisource Data Perception and Decision, Ministry of Culture and Tourism
– Gao, Man: School of Computer Science and Technology, Chongqing University of Posts and Telecommunications; Key Laboratory of Big Data Intelligent Computing, Chongqing University of Posts and Telecommunications; Chongqing Key Laboratory of Computational Intelligence, Chongqing University of Posts and Telecommunications; Key Laboratory of Tourism Multisource Data Perception and Decision, Ministry of Culture and Tourism
– Wang, Guoyin: Key Laboratory of Big Data Intelligent Computing, Chongqing University of Posts and Telecommunications; Chongqing Key Laboratory of Computational Intelligence, Chongqing University of Posts and Telecommunications; Key Laboratory of Tourism Multisource Data Perception and Decision, Ministry of Culture and Tourism |
Copyright | The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law. |
DOI | 10.1007/s10489-023-05111-4 |
Discipline | Computer Science |
EISSN | 1573-7497 |
EndPage | 29676 |
GrantInformation | Foundation for Innovative Research Groups of Natural Science Foundation of Chongqing (No. cstc2019jcyjcxttX0002); National Key Research and Development Program of China (No. 2020YFC2003502); Key Cooperation Project of Chongqing Municipal Education Commission (HZ2021008); National Natural Science Foundation of China (No. 62276038, No. 62221005) |
ISSN | 0924-669X |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 24 |
Keywords | Entity mapping; Joint relational triple extraction; Conditional layer normalization; Potential relation detection |
Language | English |
ORCID | 0000-0002-6154-4656 |
PageCount | 21 |
PublicationDate | 2023-12-01 |
PublicationPlace | New York |
PublicationSubtitle | The International Journal of Research on Intelligent Systems for Real Life Complex Problems |
PublicationTitle | Applied intelligence (Dordrecht, Netherlands) |
PublicationTitleAbbrev | Appl Intell |
PublicationYear | 2023 |
Publisher | Springer US; Springer Nature B.V |
StartPage | 29656 |
SubjectTerms | Artificial Intelligence; Computer Science; Information retrieval; Knowledge representation; Machines; Manufacturing; Mapping; Mechanical Engineering; Modules; Processes; Semantic relations; Semantics; Sentences; Teaching methods; Words (language) |
URI | https://link.springer.com/article/10.1007/s10489-023-05111-4 https://www.proquest.com/docview/2907014295 |
Volume | 53 |