SPECE: Subject Position Encoder in Complex Embedding for Relation Extraction

Bibliographic Details
Published in Electronics (Basel), Vol. 13, No. 13, p. 2571
Main Authors Wu, Shangjia, Guo, Zhiqiang, Huang, Xiaofeng, Zhang, Jialiang, Ni, Yingfang
Format Journal Article
Language English
Published Basel: MDPI AG, 01.07.2024
Summary: Entity and relation extraction, a crucial component of many natural language processing tasks, transforms unstructured text into structured data, providing essential support for constructing knowledge graphs (KGs). However, current entity relation extraction models often prioritize extracting richer semantic features or optimizing the relation extraction method itself, overlooking the importance of positional information and subject characteristics for this task. To address this, we introduce SPECE, a subject position-based complex exponential embedding model for entity relation extraction. Its encoder module combines a randomly initialized dilated convolutional network with a BERT encoder and determines the initial position of the predicted subject from semantic cues. It then integrates positional encoding features with textual features through a complex exponential embedding method. Experiments on the NYT and WebNLG datasets show that SPECE achieves significant F1-score improvements over baseline models on both datasets, confirming its effectiveness.
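The record does not detail the complex exponential embedding itself. As a hedged illustration only, the sketch below shows one common way such an embedding fuses positional and textual features: consecutive feature dimensions are paired into complex numbers and rotated by a position-dependent angle e^{i·pos·freq}. The function name, the geometric frequency schedule, and the NumPy implementation are assumptions for demonstration, not the paper's exact formulation.

```python
import numpy as np

def complex_exponential_embedding(x, base=10000.0):
    """Inject position into token features by rotating each complex-paired
    feature dimension by a position-dependent angle (illustrative sketch).

    x: (seq_len, dim) real-valued feature matrix; dim must be even.
    Returns an array of the same shape with positions encoded.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "feature dimension must be even to form complex pairs"
    # One frequency per complex pair, geometrically spaced (assumed schedule)
    freqs = 1.0 / (base ** (np.arange(0, dim, 2) / dim))       # (dim/2,)
    angles = np.arange(seq_len)[:, None] * freqs[None, :]      # (seq_len, dim/2)
    rot = np.exp(1j * angles)                                  # e^{i * pos * freq}
    # View consecutive feature pairs as complex numbers, rotate, convert back
    xc = x[:, 0::2] + 1j * x[:, 1::2]
    yc = xc * rot
    out = np.empty_like(x)
    out[:, 0::2] = yc.real
    out[:, 1::2] = yc.imag
    return out
```

Because the operation is a pure rotation in each complex plane, it preserves the magnitude of every feature pair, so positional information is added without distorting the textual feature norms.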
ISSN: 2079-9292
DOI: 10.3390/electronics13132571