Noise-Tolerant Supervised Relation Extraction
Published in: 2021 International Conference on Artificial Intelligence and Electromechanical Automation (AIEA), pp. 247-253
Format: Conference Proceeding
Language: English
Published: IEEE, 01.05.2021
Summary: Existing supervised relation extraction methods share two general flaws: their feature extractors leave room for improvement, and they cannot eliminate task-irrelevant words. To address these issues, this paper proposes a relation extraction model that avoids interference from task-irrelevant information. In this model, input texts are first converted to numerical index sequences, which are then fed into a Mogrifier BiGRU to encode semantic features. In the following GateAttention module, the encoded entities are used to sift out task-irrelevant words with a gate mechanism, and the remaining words are fused into a sentence vector by an attention network. Finally, the sentence vector is passed through a softmax function for relation classification. Experimental results on SemEval-2010 Task 8 show that the proposed model achieves an F1 score of 85.24 without relying on language tools or any additional information, indicating stronger feature extraction and higher resistance to noise interference than existing models.
DOI: 10.1109/AIEA53260.2021.00059
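The abstract describes the pipeline only at a high level, so the following is a minimal PyTorch sketch of a plausible realization, not the authors' implementation. The layer sizes, the number of Mogrifier rounds, the exact gate and attention formulations, and the use of entity head-token positions are all assumptions; only the overall ordering (embedding, Mogrifier BiGRU, entity-conditioned gating, attention pooling, softmax classifier) and the 19-class SemEval-2010 Task 8 label set come from the record.

```python
# Hypothetical sketch of the described pipeline; all module names, sizes, and
# formulations below are assumptions made for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MogrifierGRUCell(nn.Module):
    """GRU cell preceded by Mogrifier-style mutual gating of input and hidden state.

    Before each recurrent step, x and h modulate each other for a few rounds
    (as in the Mogrifier LSTM); the number of rounds used in the paper is unknown.
    """
    def __init__(self, input_size, hidden_size, rounds=4):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.rounds = rounds
        self.q = nn.ModuleList(nn.Linear(hidden_size, input_size, bias=False)
                               for _ in range(rounds // 2 + rounds % 2))
        self.r = nn.ModuleList(nn.Linear(input_size, hidden_size, bias=False)
                               for _ in range(rounds // 2))

    def forward(self, x, h):
        for i in range(self.rounds):
            if i % 2 == 0:                       # odd (1-indexed) rounds rescale x
                x = 2 * torch.sigmoid(self.q[i // 2](h)) * x
            else:                                # even rounds rescale h
                h = 2 * torch.sigmoid(self.r[i // 2](x)) * h
        return self.cell(x, h)


class GateAttentionRE(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden=128, num_relations=19):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.fwd = MogrifierGRUCell(emb_dim, hidden)
        self.bwd = MogrifierGRUCell(emb_dim, hidden)
        self.gate = nn.Linear(4 * hidden, 2 * hidden)  # entity-conditioned word gate
        self.att = nn.Linear(2 * hidden, 1)            # attention scorer
        self.out = nn.Linear(2 * hidden, num_relations)

    def encode(self, emb):
        # Run the Mogrifier GRU cell in both directions to form a BiGRU encoding.
        B, T, _ = emb.shape
        hf = emb.new_zeros(B, self.fwd.cell.hidden_size)
        hb = emb.new_zeros(B, self.bwd.cell.hidden_size)
        fwd_states, bwd_states = [], [None] * T
        for t in range(T):
            hf = self.fwd(emb[:, t], hf)
            fwd_states.append(hf)
        for t in reversed(range(T)):
            hb = self.bwd(emb[:, t], hb)
            bwd_states[t] = hb
        return torch.cat([torch.stack(fwd_states, 1),
                          torch.stack(bwd_states, 1)], dim=-1)   # (B, T, 2H)

    def forward(self, token_ids, e1_pos, e2_pos):
        h = self.encode(self.emb(token_ids))                      # (B, T, 2H)
        idx = torch.arange(h.size(0), device=h.device)
        ent = torch.cat([h[idx, e1_pos], h[idx, e2_pos]], dim=-1) # entity pair repr.
        # Gate: suppress dimensions the entity pair marks as task-irrelevant.
        g = torch.sigmoid(self.gate(ent)).unsqueeze(1)            # (B, 1, 2H)
        h = g * h
        # Attention: fuse the remaining word features into one sentence vector.
        a = F.softmax(self.att(h).squeeze(-1), dim=-1)            # (B, T)
        sent = torch.bmm(a.unsqueeze(1), h).squeeze(1)            # (B, 2H)
        return self.out(sent)                                     # logits for softmax


# Toy usage with random token ids; SemEval-2010 Task 8 has 19 relation classes
# (9 directed relations plus Other), which is the only figure taken from the task.
model = GateAttentionRE(vocab_size=20000)
logits = model(torch.randint(1, 20000, (2, 30)),
               e1_pos=torch.tensor([3, 5]), e2_pos=torch.tensor([10, 12]))
probs = F.softmax(logits, dim=-1)
```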