Joint relational triple extraction based on topic constraints and multicore attention fusion
Published in | The Journal of supercomputing Vol. 81; no. 11 |
---|---|
Main Authors | , , |
Format | Journal Article |
Language | English |
Published | New York: Springer US, 21.07.2025 |
Summary: | Relation extraction is a fundamental task in natural language processing, closely linked with named entity recognition. While existing methods for extracting relational triples can improve performance to some extent, they often treat identified entities as discrete categorical labels, overlooking the contextual and thematic attributes of those entities. Additionally, current models frequently ignore the textual content outside the entities, resulting in poor interaction between sub-modules and the underutilization of valuable semantic information. To address these shortcomings, we propose a novel joint entity-relation extraction model named Topic Constraint and Multicore Attention-based Joint Extraction (TCMA_JE). Our model introduces the Subject Topic Filtering module, which enriches entity representations by leveraging subject vectors to derive contextually relevant features. Furthermore, the Multicore Semantic Fusion module employs convolutional neural networks to establish an attention-based fusion mechanism, integrating entity vectors, rich-typed features, and global text representations in a comprehensive manner. This deep semantic fusion significantly enhances the model's ability to utilize textual information. Extensive experiments conducted on the NYT and WebNLG datasets demonstrate that our model achieves superior performance. |
---|---|
ISSN: | 1573-0484 |
DOI: | 10.1007/s11227-025-07631-x |
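The abstract describes a Multicore Semantic Fusion module that uses multiple convolution kernels to drive an attention-based fusion of entity vectors, typed features, and a global text representation. The paper itself does not publish code here; the following is a minimal NumPy sketch of that general idea under our own assumptions: the function name `attention_fusion`, the random kernel initialisation, mean-pooling for the global vector, and the three-way attention projection are all illustrative choices, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def conv_max_pool(tokens, kernel):
    """Valid 1-D convolution over the token axis, then max-over-time pooling."""
    k = kernel.shape[0]
    scores = [float(np.sum(tokens[i:i + k] * kernel))
              for i in range(tokens.shape[0] - k + 1)]
    return max(scores)

def attention_fusion(tokens, subject_vec, type_vec, kernel_sizes=(2, 3, 4)):
    """Fuse subject, typed-feature, and global text vectors via conv-derived
    attention weights (illustrative stand-in for the Multicore Semantic Fusion
    module; names and initialisation are assumptions, not the paper's)."""
    d = tokens.shape[1]
    global_vec = tokens.mean(axis=0)  # crude global text representation
    # one randomly initialised convolution "core" per kernel size
    query = np.array([
        conv_max_pool(tokens, rng.standard_normal((k, d)) / np.sqrt(k * d))
        for k in kernel_sizes
    ])
    # project pooled conv features to one attention score per source vector
    W = rng.standard_normal((3, len(kernel_sizes))) / np.sqrt(len(kernel_sizes))
    weights = softmax(W @ query)  # attention over the three sources
    sources = np.stack([subject_vec, type_vec, global_vec])
    return weights @ sources  # fused representation, shape (d,)

# usage: 10 tokens with 8-dimensional embeddings
tokens = rng.standard_normal((10, 8))
fused = attention_fusion(tokens,
                         rng.standard_normal(8),   # subject vector
                         rng.standard_normal(8))   # typed-feature vector
```

The fused vector is a convex combination of the three source vectors, so it stays in their span; in the actual model the convolution kernels and projection would be learned rather than fixed random draws.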