Semantic Matching Template-Based Zero-Shot Relation Triplet Extraction
Published in | IEICE Transactions on Information and Systems, Vol. E108.D, No. 3, pp. 277-285 |
---|---|
Main Authors |  |
Format | Journal Article |
Language | English |
Published | Tokyo: The Institute of Electronics, Information and Communication Engineers, 01.03.2025 (Japan Science and Technology Agency) |
ISSN | 0916-8532; 1745-1361 |
DOI | 10.1587/transinf.2024EDP7137 |
Summary | To address the limitation of annotated datasets confined to fixed relation domains, which hampers the effective extraction of triplets, especially for novel relation types, our work introduces an innovative approach. We propose a method for training large-scale language models using prompt templates designed for zero-shot learning in relation triplet extraction tasks. By utilizing these specially crafted prompt templates in combination with fine-grained matching scoring rules, we transform the structured prediction task into a cloze task. This transformation aligns the task more closely with the intrinsic capabilities of the language model, facilitating a more natural processing flow. Experimental evaluations on two public datasets show that our method achieves stable and enhanced performance compared to baseline models. This improvement underscores the efficiency and potential of our approach in facilitating zero-shot extraction of relation triplets, thus broadening the scope of applicable relation types without the need for domain-specific training data. |
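
The summary above frames relation triplet extraction as a cloze task: each candidate relation is verbalized by a prompt template, filled with a candidate head/tail entity pair, and the filled statement is scored against the input sentence. The sketch below is only an illustration of that general idea: the relation names, templates, entity lists, threshold, and the `cloze_score` heuristic are hypothetical placeholders, whereas the paper scores templates with a large pretrained language model and fine-grained matching rules that are not reproduced here.

```python
# Illustrative sketch only: templates, entities, and cloze_score() are
# hypothetical stand-ins. In the paper a pretrained language model scores the
# filled-in (cloze) templates; here a crude in-order token-matching heuristic
# plays that role so the example runs without any model.
from itertools import permutations

# One cloze-style template per relation label (a "verbalizer"). A new,
# unseen relation type is handled by adding a template, with no retraining.
RELATION_TEMPLATES = {
    "place_of_birth": "{head} was born in {tail}",
    "employer": "{head} works for {tail}",
    "capital_of": "{head} is the capital of {tail}",
}

def cloze_score(sentence: str, statement: str) -> float:
    """Toy stand-in for an LM cloze score: the fraction of statement tokens
    that appear in the sentence in the same order."""
    sent = [w.strip(".,").lower() for w in sentence.split()]
    stmt = [w.strip(".,").lower() for w in statement.split()]
    i = matched = 0
    for tok in stmt:
        while i < len(sent) and sent[i] != tok:
            i += 1
        if i < len(sent):          # token found at or after position i
            matched += 1
            i += 1
    return matched / len(stmt)

def extract_triplets(sentence, entities, threshold=0.6):
    """Fill every template with every ordered entity pair and keep the
    candidate triplets whose matching score clears the threshold."""
    triplets = []
    for head, tail in permutations(entities, 2):
        for relation, template in RELATION_TEMPLATES.items():
            score = cloze_score(sentence, template.format(head=head, tail=tail))
            if score >= threshold:
                triplets.append((head, relation, tail, round(score, 2)))
    return sorted(triplets, key=lambda t: -t[3])

if __name__ == "__main__":
    examples = [
        ("Marie Curie was born in Warsaw.", ["Marie Curie", "Warsaw"]),
        ("Paris is the capital of France.", ["Paris", "France"]),
    ]
    for sentence, entities in examples:
        print(sentence, "->", extract_triplets(sentence, entities))
```

Because the relation label only enters through its template, adding a template for a previously unseen relation is all that is needed to cover it, which is the zero-shot property the abstract describes; the quality of the extraction then rests entirely on how well the scoring model judges the filled-in statements.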