SAKG-BERT: Enabling Language Representation With Knowledge Graphs for Chinese Sentiment Analysis
Published in: IEEE Access, Vol. 9, pp. 101695-101701
Main Authors:
Format: Journal Article
Language: English
Published: Piscataway: IEEE, 2021 (The Institute of Electrical and Electronics Engineers, Inc.)
Summary: Sentiment analysis of online reviews is an important task in natural language processing. It has received much attention not only in academia but also in industry. Data have become an important source of competitive intelligence. Various pretraining models such as BERT and ERNIE have made great achievements in natural language processing tasks, but they lack domain-specific knowledge. Knowledge graphs can enhance language representation; furthermore, they have high entity/concept coverage and strong semantic expression ability. We propose a sentiment analysis knowledge graph (SAKG)-BERT model that combines sentiment analysis knowledge and the language representation model BERT. To improve the interpretability of the deep learning algorithm, we construct an SAKG in which triples are injected into sentences as domain knowledge. Our investigation reveals promising results in sentence completion and sentiment analysis tasks.
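The core idea described in the abstract, injecting knowledge-graph triples into a sentence before encoding, can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the toy triple store, the `SAKG_TRIPLES` and `inject_triples` names, and the expansion scheme are not taken from the paper's actual implementation.

```python
# Toy sentiment-knowledge triples: entity -> list of (relation, object).
# Illustrative only; the real SAKG is a full knowledge graph.
SAKG_TRIPLES = {
    "service": [("sentiment_aspect", "positive")],
    "battery": [("sentiment_aspect", "negative")],
}

def inject_triples(tokens, kg):
    """Append matching KG triples after each entity token,
    producing an expanded token sequence for the encoder."""
    expanded = []
    for tok in tokens:
        expanded.append(tok)
        for rel, obj in kg.get(tok, []):
            expanded.extend([rel, obj])
    return expanded

sentence = "the service was great".split()
print(inject_triples(sentence, SAKG_TRIPLES))
# -> ['the', 'service', 'sentiment_aspect', 'positive', 'was', 'great']
```

In a K-BERT-style setup, such an expanded sequence would additionally carry a soft position index and a visibility mask so that injected triples attend only to their anchor entity; that machinery is omitted here for brevity.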
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3098180