SAKG-BERT: Enabling Language Representation With Knowledge Graphs for Chinese Sentiment Analysis

Bibliographic Details
Published in: IEEE Access, Vol. 9, pp. 101695-101701
Main Authors: Yan, Xiaoyan; Jian, Fanghong; Sun, Bo
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021

Summary: Sentiment analysis of online reviews is an important task in natural language processing. It has received much attention not only in academia but also in industry, as review data have become an important source of competitive intelligence. Pretraining models such as BERT and ERNIE have achieved strong results on natural language processing tasks but lack domain-specific knowledge. Knowledge graphs can enhance language representation: they offer high entity/concept coverage and strong semantic expressiveness. We propose a sentiment analysis knowledge graph (SAKG)-BERT model that combines sentiment analysis knowledge with the language representation model BERT. To improve the interpretability of the deep learning algorithm, we construct an SAKG whose triples are injected into input sentences as domain knowledge. Our experiments show promising results on sentence completion and sentiment analysis tasks.
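The summary describes injecting knowledge-graph triples into sentences as domain knowledge. The sketch below is an illustrative reading of that idea, not the authors' code: the entity names, triples, and the bracketed injection format are all invented for demonstration. It appends a flattened (head, relation, tail) triple after each entity mention found in a review sentence, producing a knowledge-enriched string that a BERT-style model could then tokenize.

```python
# Illustrative sketch of knowledge-triple injection (hypothetical data,
# not the SAKG-BERT implementation). Each entity found in the sentence
# gets its (head, relation, tail) triple inserted right after it.

# Hypothetical sentiment-analysis knowledge graph: entity -> triple.
SAKG_TRIPLES = {
    "battery": ("battery", "has_sentiment", "negative_aspect"),
    "screen": ("screen", "has_sentiment", "positive_aspect"),
}


def inject_triples(sentence: str, kg: dict = SAKG_TRIPLES) -> str:
    """Insert a bracketed triple after each entity mention in `sentence`."""
    out = []
    for token in sentence.split():
        out.append(token)
        # Normalize the token before looking it up in the graph.
        triple = kg.get(token.lower().strip(".,!?"))
        if triple:
            head, rel, tail = triple
            out.append(f"[{head} {rel} {tail}]")
    return " ".join(out)


enriched = inject_triples("The battery drains fast but the screen is great.")
print(enriched)
```

The enriched sentence carries the graph's sentiment knowledge inline, so a downstream classifier sees both the original text and the injected domain facts in a single input sequence.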
ISSN:2169-3536
DOI:10.1109/ACCESS.2021.3098180