DG‐based SPO tuple recognition using self‐attention M‐Bi‐LSTM
Published in | ETRI Journal, Vol. 44, no. 3, pp. 438–449, June 2022 |
---|---|
Main Author | Jung, Joon‐young (Electronics and Telecommunications Research Institute; ORCID 0000-0001-6964-4005; jyjung21@etri.re.kr) |
Format | Journal Article |
Language | English |
Publisher | Electronics and Telecommunications Research Institute (ETRI), 01.06.2022 |
ISSN | 1225-6463 |
EISSN | 2233-7326 |
DOI | 10.4218/etrij.2020-0460 |
Subjects | dependency grammar; information extraction; long short-term memory; SPO tuple; electronics/information and communication engineering |
Funding | Electronics and Telecommunications Research Institute, Grant/Award Number 21ZS1100 |
Copyright | 1225‐6463/$ © 2021 ETRI |
Online Access | https://doaj.org/article/b4ed3cc4a7554c38a99848b513ffab35 |
Abstract | This study proposes a dependency grammar‐based self‐attention multilayered bidirectional long short‐term memory (DG‐M‐Bi‐LSTM) model for subject–predicate–object (SPO) tuple recognition from natural language (NL) sentences. To add recent knowledge to the knowledge base autonomously, it is essential to extract knowledge from numerous NL data. Therefore, this study proposes a high‐accuracy SPO tuple recognition model that requires a small amount of learning data to extract knowledge from NL sentences. The accuracy of SPO tuple recognition using DG‐M‐Bi‐LSTM is compared with that using NL‐based self‐attention multilayered bidirectional LSTM, DG‐based bidirectional encoder representations from transformers (BERT), and NL‐based BERT to evaluate its effectiveness. The DG‐M‐Bi‐LSTM model achieves the best results in terms of recognition accuracy for extracting SPO tuples from NL sentences even if it has fewer deep neural network (DNN) parameters than BERT. In particular, its accuracy is better than that of BERT when the learning data are limited. Additionally, its pretrained DNN parameters can be applied to other domains because it learns the structural relations in NL sentences. |
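
The abstract describes the model only at a high level, but it fixes its overall shape: dependency-grammar features feeding a multilayered bidirectional LSTM with self-attention, trained to recognize subject, predicate, and object spans. The sketch below is a minimal, hypothetical PyTorch rendering of that shape, not the paper's implementation; it assumes SPO recognition is framed as BIO-style sequence labeling and that each token carries an embedded dependency-relation label alongside its word embedding, and every layer size, tag set, and name is illustrative.

```python
# Minimal sketch (not the authors' implementation) of a self-attention,
# multilayered bidirectional LSTM tagger for SPO tuple recognition.
# Assumptions: words and dependency-relation labels are already indexed,
# and SPO recognition is cast as BIO-style sequence labeling
# (e.g., B-SUBJ, I-SUBJ, B-PRED, I-PRED, B-OBJ, I-OBJ, O).
import torch
import torch.nn as nn


class SelfAttentionMBiLSTM(nn.Module):
    def __init__(self, vocab_size, dep_size, num_tags,
                 word_dim=128, dep_dim=32, hidden=256, layers=2, heads=4):
        super().__init__()
        # Token representation: word embedding concatenated with an embedding
        # of the token's dependency relation to its head (the "DG-based"
        # input; the exact feature set is an assumption here).
        self.word_emb = nn.Embedding(vocab_size, word_dim, padding_idx=0)
        self.dep_emb = nn.Embedding(dep_size, dep_dim, padding_idx=0)
        # Multilayered bidirectional LSTM encoder.
        self.bilstm = nn.LSTM(word_dim + dep_dim, hidden, num_layers=layers,
                              bidirectional=True, batch_first=True)
        # Self-attention over the BiLSTM hidden states.
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        # Per-token classification into SPO tags.
        self.classifier = nn.Linear(2 * hidden, num_tags)

    def forward(self, word_ids, dep_ids, pad_mask):
        # pad_mask: True at padded positions, shape (batch, seq_len).
        x = torch.cat([self.word_emb(word_ids), self.dep_emb(dep_ids)], dim=-1)
        h, _ = self.bilstm(x)
        a, _ = self.attn(h, h, h, key_padding_mask=pad_mask)
        return self.classifier(a)  # (batch, seq_len, num_tags)


# Toy usage: one padded batch of two ten-token sentences.
model = SelfAttentionMBiLSTM(vocab_size=1000, dep_size=50, num_tags=7)
words = torch.randint(1, 1000, (2, 10))
deps = torch.randint(1, 50, (2, 10))
mask = torch.zeros(2, 10, dtype=torch.bool)
logits = model(words, deps, mask)
print(logits.shape)  # torch.Size([2, 10, 7])
```

Note that no BERT-scale pretraining appears in the sketch, which mirrors the abstract's point that the model targets settings with limited learning data and far fewer DNN parameters than BERT.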