Multistream BertGCN for Sentiment Classification Based on Cross-Document Learning


Bibliographic Details
Published in: Quantum Engineering, Vol. 2023, pp. 1-9
Main Authors: Li, Meng; Xie, Yujin; Yang, Weifeng; Chen, Shenyu
Format: Journal Article
Language: English
Published: Hoboken: Hindawi / John Wiley & Sons, Inc., 13.11.2023
Summary: Very recently, the BERT graph convolutional network (BertGCN) model has attracted much attention from researchers due to its strong text classification performance. However, constructing the graph topology for GCN-based models from only the original documents in the corpus may lose useful information. In this paper, we focus on sentiment classification, an important branch of text classification, and propose the multistream BERT graph convolutional network (MS-BertGCN) for sentiment classification based on cross-document learning. In the proposed method, we first combine the documents in the training set based on within-class similarity. Then, each heterogeneous graph is constructed from a group of combined documents for a single-stream BertGCN model. Finally, we build the multistream BertGCN (MS-BertGCN) from the multiple heterogeneous graphs constructed from the different groups of combined documents. The experimental results show that our MS-BertGCN model outperforms state-of-the-art methods on sentiment classification tasks.
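
The abstract outlines two concrete steps: combining training documents by within-class similarity into several groups (one per stream), and fusing the predictions of the per-stream single-stream BertGCN models. Below is a minimal, hypothetical Python sketch of how those two steps could look, assuming TF-IDF cosine similarity as the within-class similarity measure and simple logit averaging as the fusion step (neither is specified in the abstract); the heterogeneous word-document graph construction and the single-stream BertGCN itself are treated as black boxes, and all names here (combine_within_class, fuse_stream_logits, n_streams) are illustrative, not the authors' code.

# Hypothetical sketch of cross-document combination and multistream fusion.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def combine_within_class(docs, labels, n_streams=3):
    """Build n_streams combined-document corpora: in stream s, each document
    is concatenated with its (s+1)-th most similar same-class document."""
    tfidf = TfidfVectorizer().fit_transform(docs)
    streams = [[] for _ in range(n_streams)]
    for i, (doc, y) in enumerate(zip(docs, labels)):
        same = [j for j, yj in enumerate(labels) if yj == y and j != i]
        if not same:  # lone class member: keep the document unchanged
            for s in range(n_streams):
                streams[s].append(doc)
            continue
        sims = cosine_similarity(tfidf[i], tfidf[same]).ravel()
        order = np.argsort(-sims)  # same-class documents, most similar first
        for s in range(n_streams):
            j = same[order[s % len(order)]]
            streams[s].append(doc + " " + docs[j])
    return streams  # one combined corpus per stream

def fuse_stream_logits(stream_logits):
    """Fuse per-stream class logits from the single-stream BertGCN models
    (simple averaging here; the paper's fusion scheme may differ)."""
    return np.mean(np.stack(stream_logits, axis=0), axis=0)

Each stream's combined corpus would then feed its own heterogeneous graph and single-stream BertGCN; the multistream model combines the resulting per-stream outputs as sketched in fuse_stream_logits.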
ISSN: 2577-0470
DOI: 10.1155/2023/3668960