SwitchNet: A modular neural network for adaptive relation extraction

Bibliographic Details
Published in: Computers & Electrical Engineering, Vol. 104, no. B, Article 108445
Main Authors: Zhu, Hongyin; Tiwari, Prayag; Zhang, Yazhou; Gupta, Deepak; Alharbi, Meshal; Nguyen, Tri Gia; Dehdashti, Shahram
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.12.2022
Subjects: Relation extraction; Entity pair; Joint optimization; Information flow; Modular neural network
ISSN: 0045-7906
EISSN: 1879-0755
DOI: 10.1016/j.compeleceng.2022.108445
License: Open access article under the CC BY license. © 2022 The Author(s)
Online Access: https://www.sciencedirect.com/science/article/pii/S0045790622006607 ; https://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-48910

Abstract

This paper presents a portable toolkit, SwitchNet, for extracting relations from textual input. We summarize four data protocols for relation extraction tasks: relation classification, relation extraction, triple extraction, and distant supervision relation extraction. The neural architecture is modular, so it can take as input data from different stages of the information extraction process (plain text, text with entities, or text with entity pairs as relation candidates) and compute the remaining steps (named entity recognition and relation classification). We systematically design four information flows that integrate the above protocols by sharing network building blocks and switching between the flows. The framework can extract multiple (subject, predicate, object) triples in one pass, and it extends relation classification models to end-to-end triple extraction by inferring pairs of entities of interest (POEOI) and using a shared representation mechanism.

Highlights
• Dividing the information extraction process into modular neural networks.
• Four information flows for integrating four relation extraction data protocols.
• Integrating the NER and RE subtasks through POEOI inference.
• Performance improvements on four relation extraction tasks.
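The abstract describes one set of shared network building blocks that serves all four protocols, with the information flow selected by what the input already contains (raw text, entities, or entity pairs). The sketch below is not the SwitchNet implementation; it is a minimal PyTorch illustration, with assumed names (ModularRE, ner_head, rel_head) and a deliberately toy pair-inference rule, of how a shared encoder can serve both a relation-classification flow (entity pairs supplied) and an end-to-end triple-extraction flow (pairs inferred from the NER output).

```python
# Minimal sketch of a modular relation-extraction network with switchable flows.
# NOT the SwitchNet code: names, sizes, and the pair-inference rule are
# illustrative assumptions only.
import torch
import torch.nn as nn

class ModularRE(nn.Module):
    def __init__(self, vocab_size=30000, hidden=256, num_entity_tags=9, num_relations=42):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Shared encoder reused by every protocol / information flow.
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        # NER head: tags each token (used when entities are not given in the input).
        self.ner_head = nn.Linear(2 * hidden, num_entity_tags)
        # Relation head: classifies a candidate (head, tail) entity pair.
        self.rel_head = nn.Linear(4 * hidden, num_relations)

    def forward(self, token_ids, entity_pairs=None):
        states, _ = self.encoder(self.embed(token_ids))   # (batch, seq, 2*hidden)
        ner_logits = self.ner_head(states)                # token-level entity tags
        if entity_pairs is None:
            # End-to-end flow: infer candidate pairs from the NER output.
            # Toy rule: pair the two tokens with the highest entity-tag scores.
            scores = ner_logits.max(dim=-1).values        # (batch, seq)
            top2 = scores.topk(2, dim=-1).indices         # (batch, 2)
            entity_pairs = top2.unsqueeze(1)              # (batch, 1, 2)
        # Relation-classification flow: pool the head/tail token representations.
        b = torch.arange(states.size(0)).unsqueeze(-1)    # (batch, 1)
        head = states[b, entity_pairs[..., 0]]            # (batch, pairs, 2*hidden)
        tail = states[b, entity_pairs[..., 1]]
        rel_logits = self.rel_head(torch.cat([head, tail], dim=-1))
        return ner_logits, rel_logits

# Usage: the same module runs the pair-given flow and the text-only flow.
model = ModularRE()
tokens = torch.randint(0, 30000, (2, 16))                    # two toy 16-token sentences
given_pairs = torch.tensor([[[1, 5]], [[0, 7]]])             # (batch, 1 pair, 2 indices)
_, rel_from_pairs = model(tokens, entity_pairs=given_pairs)  # relation classification
ner_logits, rel_end2end = model(tokens)                      # one-pass triple extraction
```

In the paper's terms, switching the information flow corresponds to choosing which inputs are supplied and which shared blocks are executed; because the encoder is shared, a model trained for relation classification can be reused for one-pass triple extraction once an entity-pair inference step is added.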
Authors and Affiliations
1. Hongyin Zhu (zhuhongyin@inspur.com; ORCID 0000-0002-4223-5209), Inspur Electronic Information Industry Co., Ltd., Jinan 250101, China
2. Prayag Tiwari (prayag.tiwari@ieee.org), School of Information Technology, Halmstad University, Sweden
3. Yazhou Zhang (yzzhang@zzuli.edu.cn), Software Engineering College, Zhengzhou University of Light Industry, Zhengzhou 450002, China
4. Deepak Gupta (deepakgupta@mait.ac.in), Maharaja Agrasen Institute of Technology, Delhi, India
5. Meshal Alharbi (Mg.alharbi@psau.edu.sa), Department of Computer Science, College of Computer Engineering and Sciences, Prince Sattam Bin Abdulaziz University, P.O. Box 151, Alkharj 11942, Saudi Arabia
6. Tri Gia Nguyen (tri@ieee.org), FPT University, Danang 50509, Viet Nam
7. Shahram Dehdashti (shahram.dehdashti@gmail.com), School of Information Systems, Queensland University of Technology, Brisbane 4000, Australia