Research on Sentiment Analysis of Tibetan Short Text Based on Dual-channel Hybrid Neural Network

Bibliographic Details
Published in 2023 IEEE 4th International Conference on Pattern Recognition and Machine Learning (PRML), pp. 377 - 384
Main Authors Zhu, Yulei, Luosai, Baima, Zhou, Liyuan, Qun, Nuo, Nyima, Tashi
Format Conference Proceeding
Language English
Published IEEE 04.08.2023
Subjects
Abstract To address the problem that textual semantic information is lost to varying degrees as model depth increases in single-channel hybrid neural network models, this paper proposes a dual-channel hybrid neural network model, ALDCBAT, based on the idea of multi-channel hybrid neural networks and built from the ALBERT pre-trained model, a convolutional neural network, and a bidirectional gated recurrent unit network. The model first vectorizes Tibetan text with the ALBERT pre-trained model and then feeds the word vectors into the TextCNN model and the BiGRU model in parallel. Next, an attention mechanism is introduced to strengthen the text feature extraction ability of the BiGRU model. Finally, the output of the TextCNN model is concatenated with the outputs of the BiGRU model and the attention mechanism to form the final output. Experimental results show that the proposed dual-channel hybrid neural network model reaches a classification accuracy of 91.12%, which partly alleviates the loss of semantic information and effectively improves the accuracy of Tibetan sentiment analysis.
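
The dual-channel design described in the abstract can be sketched in code. The following is a minimal, illustrative PyTorch sketch only: ALBERT token vectors feed a TextCNN branch and a BiGRU-with-attention branch, and the branch outputs are concatenated for classification. The checkpoint name albert-base-v2, the class name ALDCBATSketch, and all layer sizes, kernel sizes, and the additive attention formulation are assumptions made for illustration; the paper's actual Tibetan ALBERT checkpoint and ALDCBAT hyperparameters are not given in this record.

    # Illustrative sketch of the dual-channel idea; not the authors' code.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from transformers import AlbertModel  # assumes an ALBERT checkpoint is available

    class ALDCBATSketch(nn.Module):  # hypothetical name for this sketch
        def __init__(self, albert_name="albert-base-v2", num_classes=2,
                     cnn_filters=128, kernel_sizes=(3, 4, 5), gru_hidden=128):
            super().__init__()
            # Shared encoder: ALBERT produces contextual word vectors for both channels.
            self.albert = AlbertModel.from_pretrained(albert_name)
            emb_dim = self.albert.config.hidden_size

            # Channel 1: TextCNN over the token vectors (one conv per kernel size).
            self.convs = nn.ModuleList(
                nn.Conv1d(emb_dim, cnn_filters, k) for k in kernel_sizes
            )

            # Channel 2: BiGRU followed by additive attention over its hidden states.
            self.bigru = nn.GRU(emb_dim, gru_hidden, batch_first=True, bidirectional=True)
            self.attn = nn.Linear(2 * gru_hidden, 1)

            # Classifier over the concatenated TextCNN, attention, and BiGRU summaries.
            concat_dim = cnn_filters * len(kernel_sizes) + 2 * (2 * gru_hidden)
            self.classifier = nn.Linear(concat_dim, num_classes)

        def forward(self, input_ids, attention_mask):
            # (batch, seq_len, emb_dim) contextual vectors from ALBERT.
            x = self.albert(input_ids=input_ids,
                            attention_mask=attention_mask).last_hidden_state

            # TextCNN branch: convolution + max-over-time pooling per kernel size.
            c = x.transpose(1, 2)  # (batch, emb_dim, seq_len)
            cnn_out = torch.cat(
                [F.relu(conv(c)).max(dim=2).values for conv in self.convs], dim=1
            )

            # BiGRU branch: attention-weighted summary plus a plain mean-pooled summary.
            h, _ = self.bigru(x)  # (batch, seq_len, 2 * gru_hidden)
            scores = self.attn(h).masked_fill(
                attention_mask.unsqueeze(-1) == 0, float("-inf")
            )
            weights = torch.softmax(scores, dim=1)
            attn_out = (weights * h).sum(dim=1)
            gru_out = h.mean(dim=1)  # simple pooling; padding handling omitted for brevity

            # Concatenate the TextCNN output with the BiGRU and attention outputs.
            features = torch.cat([cnn_out, attn_out, gru_out], dim=1)
            return self.classifier(features)

In practice such a model would be trained end-to-end with a cross-entropy loss on labeled Tibetan short texts, with the ALBERT weights either frozen or fine-tuned.
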
Author Luosai, Baima
Zhou, Liyuan
Nyima, Tashi
Qun, Nuo
Zhu, Yulei
Author_xml – sequence: 1
  givenname: Yulei
  surname: Zhu
  fullname: Zhu, Yulei
  email: zhuyulei@utibet.edu.cn
  organization: Tibet University, School of Information Science and Technology, Lhasa, China
– sequence: 2
  givenname: Baima
  surname: Luosai
  fullname: Luosai, Baima
  email: LC01010507@163.com
  organization: Tibet University, School of Information Science and Technology, Lhasa, China
– sequence: 3
  givenname: Liyuan
  surname: Zhou
  fullname: Zhou, Liyuan
  email: zliyuan@utibet.edu.cn
  organization: Tibet University, School of Information Science and Technology, Lhasa, China
– sequence: 4
  givenname: Nuo
  surname: Qun
  fullname: Qun, Nuo
  email: q_nuo@utibet.edu.cn
  organization: Tibet University, School of Information Science and Technology, Lhasa, China
– sequence: 5
  givenname: Tashi
  surname: Nyima
  fullname: Nyima, Tashi
  email: nmzx@utibet.edu.cn
  organization: Tibet University, School of Information Science and Technology, Lhasa, China
ContentType Conference Proceeding
DBID 6IE
6IL
CBEJK
RIE
RIL
DOI 10.1109/PRML59573.2023.10348366
DatabaseName IEEE Electronic Library (IEL) Conference Proceedings
IEEE Xplore POP ALL
IEEE Xplore All Conference Proceedings
IEEE Electronic Library (IEL)
IEEE Proceedings Order Plans (POP All) 1998-Present
DatabaseTitleList
Database_xml – sequence: 1
  dbid: RIE
  name: IEEE Electronic Library (IEL)
  url: https://proxy.k.utb.cz/login?url=https://ieeexplore.ieee.org/
  sourceTypes: Publisher
DeliveryMethod fulltext_linktorsrc
EISBN 9798350324303
EndPage 384
ExternalDocumentID 10348366
Genre orig-research
GrantInformation_xml – fundername: National Natural Science Foundation of China
  funderid: 10.13039/501100001809
GroupedDBID 6IE
6IL
CBEJK
RIE
RIL
ID FETCH-LOGICAL-i119t-28d35130fa1832e128425e47a64bd6c9a21bb5f202fd359be6480bdfaad3af8f3
IEDL.DBID RIE
IngestDate Wed Jan 10 09:27:53 EST 2024
IsPeerReviewed false
IsScholarly false
Language English
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-i119t-28d35130fa1832e128425e47a64bd6c9a21bb5f202fd359be6480bdfaad3af8f3
PageCount 8
ParticipantIDs ieee_primary_10348366
PublicationCentury 2000
PublicationDate 2023-Aug.-4
PublicationDateYYYYMMDD 2023-08-04
PublicationDate_xml – month: 08
  year: 2023
  text: 2023-Aug.-4
  day: 04
PublicationDecade 2020
PublicationTitle 2023 IEEE 4th International Conference on Pattern Recognition and Machine Learning (PRML)
PublicationTitleAbbrev PRML
PublicationYear 2023
Publisher IEEE
Publisher_xml – name: IEEE
SourceID ieee
SourceType Publisher
StartPage 377
SubjectTerms Analytical models
BiGRU
Feature extraction
Logic gates
Machine learning
Neural networks
pretraining model
Semantics
Sentiment analysis
TextCNN
Tibetan sentiment analysis
Title Research on Sentiment Analysis of Tibetan Short Text Based on Dual-channel Hybrid Neural Network
URI https://ieeexplore.ieee.org/document/10348366
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IEEE