A Novel Encoder-Decoder Knowledge Graph Completion Model for Robot Brain

Bibliographic Details
Published in: Frontiers in Neurorobotics Vol. 15; p. 674428
Main Authors: Song, Yichen; Li, Aiping; Tu, Hongkui; Chen, Kai; Li, Chenchen
Format: Journal Article
Language: English
Published: Switzerland: Frontiers Research Foundation / Frontiers Media S.A., 11.05.2021
ISSN: 1662-5218
DOI: 10.3389/fnbot.2021.674428

More Information
Summary: With the rapid development of artificial intelligence, cybernetics, and other high-tech disciplines, robots have been built and deployed in a growing range of fields, and research on robots has attracted increasing interest from different communities. A knowledge graph can act as the brain of a robot and provide the intelligence needed to support interaction between the robot and human beings. Although large-scale knowledge graphs contain a large amount of information, they remain incomplete compared with real-world knowledge. Most existing methods for knowledge graph completion focus on entity representation learning, while the importance of relation representation learning, as well as the cross-interaction between entities and relations, is ignored. In this paper, we propose an encoder-decoder model that embeds the interaction between entities and relations and adds a gate mechanism to control the attention mechanism. Experimental results show that our method achieves better link prediction performance than state-of-the-art embedding models on two benchmark datasets, WN18RR and FB15k-237.
Reviewed by: Xiang Lin, Shanghai Jiao Tong University, China; Li Xiaoyong, Beijing University of Posts and Telecommunications (BUPT), China; Jinqiao Shi, Beijing University of Posts and Telecommunications (BUPT), China
Edited by: Zhaoquan Gu, Guangzhou University, China