A Knowledge Graph Summarization Model Integrating Attention Alignment and Momentum Distillation
| Published in | Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 29, No. 1, pp. 205-214 |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | Tokyo: Fuji Technology Press Co. Ltd., 01.01.2025 |
Summary: Integrated knowledge graph summarization models improve summary quality by combining text features with entity features. However, such models still have two shortcomings: the knowledge graph data introduce noise that deviates from the semantics of the original text, and the text and knowledge graph entity features are not fully integrated. To address these issues, a knowledge graph summarization model integrating attention alignment and momentum distillation (KGS-AAMD) is proposed. Pseudo-targets generated by a momentum distillation model serve as additional supervision signals during training to overcome data noise, while an attention-based alignment method aligns text and entity features, laying the foundation for their subsequent full integration. Experimental results on two public datasets, CNN/Daily Mail and XSum, show that KGS-AAMD surpasses multiple baseline models as well as ChatGPT in the quality of generated summaries, exhibiting significant performance advantages.
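The abstract names two mechanisms: a momentum-distillation teacher whose pseudo-targets supplement the gold labels, and attention-based alignment of text and entity features. The PyTorch sketch below is only a minimal illustration of those two generic ideas, not the authors' implementation; the module names, dimensions, EMA coefficient `m`, and mixing weight `alpha` are all hypothetical.

```python
# Illustrative sketch (not the KGS-AAMD code): an EMA "momentum" teacher
# provides soft pseudo-targets, and cross-attention aligns entity features
# to text features before fusion.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class AlignedEncoder(nn.Module):
    """Toy encoder: text tokens attend to knowledge-graph entity features."""
    def __init__(self, dim=256, heads=4, vocab=32000):
        super().__init__()
        self.align = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.fuse = nn.Linear(2 * dim, dim)
        self.lm_head = nn.Linear(dim, vocab)

    def forward(self, text_feats, entity_feats):
        # Cross-attention: queries are text tokens, keys/values are entities.
        aligned, _ = self.align(text_feats, entity_feats, entity_feats)
        fused = self.fuse(torch.cat([text_feats, aligned], dim=-1))
        return self.lm_head(fused)  # per-token vocabulary logits

def ema_update(teacher, student, m=0.995):
    # Momentum (EMA) update: teacher weights slowly trail the student's.
    with torch.no_grad():
        for pt, ps in zip(teacher.parameters(), student.parameters()):
            pt.mul_(m).add_(ps, alpha=1.0 - m)

def distilled_loss(student_logits, teacher_logits, targets, alpha=0.4):
    # Hard loss against gold tokens plus a soft loss against the momentum
    # teacher's pseudo-targets, which damps the impact of noisy KG triples.
    hard = F.cross_entropy(student_logits.flatten(0, 1), targets.flatten())
    soft = F.kl_div(
        F.log_softmax(student_logits, dim=-1),
        F.softmax(teacher_logits, dim=-1),
        reduction="batchmean",
    )
    return (1 - alpha) * hard + alpha * soft

# One hypothetical training step on random features (optimizer step omitted).
student = AlignedEncoder()
teacher = copy.deepcopy(student)
for p in teacher.parameters():
    p.requires_grad_(False)

text = torch.randn(2, 16, 256)          # [batch, text_len, dim]
ents = torch.randn(2, 8, 256)           # [batch, num_entities, dim]
gold = torch.randint(0, 32000, (2, 16)) # gold summary token ids

with torch.no_grad():
    teacher_logits = teacher(text, ents)  # pseudo-targets
loss = distilled_loss(student(text, ents), teacher_logits, gold)
loss.backward()
ema_update(teacher, student)
```

The teacher is deliberately updated only by exponential moving average, so its pseudo-targets stay stable even when individual training examples carry noisy knowledge-graph entities; how KGS-AAMD actually parameterizes the teacher and the alignment module is described in the paper itself.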
ISSN: 1343-0130; 1883-8014
DOI: 10.20965/jaciii.2025.p0205