Entity relation joint extraction model combining pointer network and attention mechanism based on relative position embedding

Bibliographic Details
Main Authors: Liu, Wei; Xu, Jun; Xu, Chunsheng
Format: Conference Proceeding
Language: English
Published: SPIE, 19.02.2024
Summary: Extracting entity relations from unstructured text is an important step in constructing a knowledge graph, but most current methods handle the problem of overlapping entities poorly. In this paper, we propose a novel joint entity-relation extraction model. It extracts subjects through pointer annotation, fuses the extracted subjects with the sentence vector, and then feeds the result into an attention layer (Attention Mechanism Based on Relative Position Embedding, AMBRPE) to enhance feature expressiveness. Under predefined relation conditions, the model extracts the objects corresponding to the extracted subjects to generate relation triplets, and its hierarchical pointer annotation effectively resolves overlapping entities. In addition, we introduce an Adversarial Training Component (ATC), which generates adversarial samples during training and acts as a text data augmentation method to improve the model's generalization ability. Extensive experiments on the public datasets NYT and WebNLG show that our model outperforms the cascaded binary tagging framework (CasRel) by 2.1 and 0.9 percentage points, respectively. Ablation experiments further verify the effectiveness of the proposed model.
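The abstract does not spell out how AMBRPE computes its scores, so the following is only a minimal sketch of single-head attention with relative position embeddings in the general style popularized by Shaw et al.; the function name `rel_pos_attention`, the clipping distance `max_dist`, and all parameter shapes are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def rel_pos_attention(x, w_q, w_k, w_v, rel_emb, max_dist):
    """Single-head attention whose scores include a learned embedding of
    the (clipped) relative distance between token positions.
    Hypothetical sketch; not the paper's exact AMBRPE layer."""
    n, d = x.shape
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Relative distance i - j for every token pair, clipped to
    # [-max_dist, max_dist] and shifted into a valid embedding index.
    idx = np.clip(np.arange(n)[:, None] - np.arange(n)[None, :],
                  -max_dist, max_dist) + max_dist
    r = rel_emb[idx]                                  # (n, n, d)
    # Content score plus relative-position score, scaled by sqrt(d).
    scores = (q @ k.T + np.einsum('id,ijd->ij', q, r)) / np.sqrt(d)
    # Numerically stable softmax over the key dimension.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v                                      # (n, d)
```

Because the distance is clipped, the layer needs only `2 * max_dist + 1` relative embeddings regardless of sentence length, which is the usual motivation for this parameterization.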
Bibliography: Conference Location: Changchun, China
Conference Date: 2023-10-20 to 2023-10-22
ISBN: 1510674446; 9781510674448
ISSN: 0277-786X
DOI: 10.1117/12.3021506