A Context-Fusion Method for Entity Extraction Based on Residual Gated Convolution Neural Network (融合上下文的残差门卷积实体抽取)

Entity extraction methods built on conventional convolutional frameworks are constrained by the size of the convolutional receptive field: the association between the current word and its context is limited, the semantics of entity words within the whole sentence are insufficiently considered, and recognition performance suffers. To address this problem, an entity recognition method based on residual gated convolution is proposed. It uses dilated convolutions and gated linear units with residual connections to model semantic associations between words across multiple temporal scales simultaneously, employs the gating units to regulate how much information flows to the next layer of neurons, thereby alleviating the vanishing-gradient problem in cross-layer propagation, and combines an attention mechanism to capture related semantics between words. Results on public named-entity recognition datasets and on domain-specific datasets show that, compared with conventional entity extraction frameworks, the residual gated convolution algorithm is strongly competitive in both speed and accuracy, demonstrating its superiority and robustness.
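
The method described above centers on stacked dilated convolutions whose outputs are gated (GLU-style) and carried forward through residual connections, so each layer controls how much information flows onward while gradients still propagate across layers. Below is a minimal sketch, in PyTorch, of what one such residual gated dilated convolution block can look like; the class name, channel width, kernel size, and dilation schedule are illustrative assumptions rather than the configuration used in the paper, and the attention component is omitted.

    import torch
    import torch.nn as nn

    class ResidualGatedConv1d(nn.Module):
        """Dilated 1-D convolution with GLU-style gating and a residual connection."""

        def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
            super().__init__()
            # Padding chosen so the sequence length is preserved for odd kernel sizes.
            padding = (kernel_size - 1) // 2 * dilation
            # One convolution produces both the candidate features and the gate.
            self.conv = nn.Conv1d(channels, 2 * channels, kernel_size,
                                  padding=padding, dilation=dilation)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, channels, sequence_length)
            value, gate = self.conv(x).chunk(2, dim=1)
            # The sigmoid gate regulates how much new information reaches the next
            # layer; the residual path keeps gradients flowing across stacked layers.
            return x + value * torch.sigmoid(gate)

    # Growing dilations widen the receptive field so each token sees more context.
    encoder = nn.Sequential(*(ResidualGatedConv1d(128, dilation=d) for d in (1, 2, 4)))
    tokens = torch.randn(8, 128, 50)   # hypothetical batch: 8 sentences, 128-dim embeddings, 50 tokens
    features = encoder(tokens)         # same shape, now context-aware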

Bibliographic Details
Published in 北京大学学报(自然科学版) (Acta Scientiarum Naturalium Universitatis Pekinensis), Vol. 58, No. 1, pp. 69–76
Main Authors 苏丰龙 (SU Fenglong), 孙承哲 (SUN Chengzhe), 景宁 (JING Ning)
Format Journal Article
Language Chinese
Published 国防科技大学电子科学学院, 长沙 410073, 20.01.2022
ISSN 0479-8023
DOI 10.13209/j.0479-8023.2021.102


Author 苏丰龙, 孙承哲, 景宁
AuthorAffiliation 国防科技大学电子科学学院, 长沙 410073
Author_FL SU Fenglong, SUN Chengzhe, JING Ning
BookMark eNrjYmDJy89LZWCQNTTQMzQ2MrDUz9IzMDG31LUwMDLWMzIwMtQzNDBiYeCEC3Iw8BYXZyYZGBoZWViamRhyMui-mNf7dELHkx1dT3Z0P5vW_nxWy7N13U-3r3s5fcXT3u3Pl69_um7ek72Tn3Xtfdo_jYeBNS0xpziVF0pzM4S6uYY4e-j6-Lt7Ojv66BYDLTTQNTa2TDFKSjJJsTBOTjUxMEmzNE01TjEwTDJKSzE1SUo0M0oysDQwMbYwskwyMzM2NjayMDRITjYGMoEKgRLcDOoQc8sT89IS89Ljs_JLi_KANsYnZaVUVCQB_WZkYAiExgDaOlUB
ContentType Journal Article
Copyright Copyright © Wanfang Data Co. Ltd. All Rights Reserved.
DOI 10.13209/j.0479-8023.2021.102
DatabaseName Wanfang Data Journals - Hong Kong
WANFANG Data Centre
Wanfang Data Journals
China Online Journals (COJ)
Discipline Sciences (General)
DocumentTitle_FL A Context-Fusion Method for Entity Extraction Based on Residual Gated Convolution Neural Network
EndPage 76
ExternalDocumentID bjdxxb202201010
ISSN 0479-8023
IsPeerReviewed false
IsScholarly true
Issue 1
Keywords entity extraction; residual gated convolution; vanishing gradient; attention mechanism
Language Chinese
PageCount 8
ParticipantIDs wanfang_journals_bjdxxb202201010
PublicationDate 2022-01-20
PublicationTitle 北京大学学报(自然科学版)
PublicationTitle_FL Acta Scientiarum Naturalium Universitatis Pekinensis
PublicationYear 2022
Publisher 国防科技大学电子科学学院, 长沙 410073
SourceID wanfang
SourceType Aggregation Database
StartPage 69
Title 融合上下文的残差门卷积实体抽取
URI https://d.wanfangdata.com.cn/periodical/bjdxxb202201010
Volume 58