Distillation Language Adversarial Network for Cross-lingual Sentiment Analysis


Bibliographic Details
Published in 2022 International Conference on Asian Language Processing (IALP), pp. 45–50
Main Authors Wang, Deheng, Yang, Aimin, Zhou, Yongmei, Xie, Fenfang, Ouyang, Zhouhao, Peng, Sancheng
Format Conference Proceeding
Language English
Published IEEE 27.10.2022
DOI 10.1109/IALP57159.2022.9961285


Abstract Cross-lingual sentiment analysis aims to address the lack of annotated corpora in low-resource languages by training a common classifier that transfers knowledge learned from a source language to target languages. Large-scale pre-trained language models have achieved remarkable improvements in cross-lingual sentiment analysis, but they still suffer from the scarcity of annotated corpora for low-resource languages. To address this problem, we propose an end-to-end architecture for cross-lingual sentiment analysis, named Distillation Language Adversarial Network (DLAN). Built on a pre-trained model, DLAN uses adversarial learning with knowledge distillation to learn language-invariant features without extra training data. We evaluate the proposed method on the Amazon review dataset, a multilingual sentiment dataset. The results show that DLAN is more effective than the baseline methods in cross-lingual sentiment analysis.
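The abstract describes DLAN as combining adversarial learning (to make encoder features language-invariant) with knowledge distillation from a pre-trained model. The record does not include the paper's actual architecture or loss definitions, so the following is only a minimal NumPy sketch of the two loss terms such a setup typically combines: a temperature-softened distillation loss, and an adversarial term that pushes a language discriminator's prediction toward uniform. All function names, weights, and toy values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * log(q_i)."""
    return float(-np.sum(p * np.log(q + eps)))

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """Soft-label distillation: cross-entropy between the temperature-
    softened teacher and student output distributions."""
    return cross_entropy(softmax(teacher_logits, T), softmax(student_logits, T))

def adversarial_loss(lang_probs):
    """Adversarial (language-invariance) term: push the discriminator's
    predicted language distribution toward uniform, so the encoder's
    features carry no language identity."""
    uniform = np.full_like(lang_probs, 1.0 / len(lang_probs))
    return cross_entropy(uniform, lang_probs)

# Toy values (hypothetical): 3-class sentiment logits, 2-language discriminator.
teacher_logits = np.array([2.0, 0.5, -1.0])   # source-language teacher
student_logits = np.array([1.5, 0.8, -0.5])   # cross-lingual student
gold = np.array([1.0, 0.0, 0.0])              # one-hot source label
lang_probs = softmax(np.array([0.3, -0.3]))   # discriminator's language prediction

# Combined objective with hypothetical weights lambda_adv and lambda_kd.
task_loss = cross_entropy(gold, softmax(student_logits))
lambda_adv, lambda_kd = 0.1, 0.5
total = (task_loss
         + lambda_adv * adversarial_loss(lang_probs)
         + lambda_kd * distillation_loss(teacher_logits, student_logits))
print(round(total, 4))
```

In a full training loop the adversarial term is usually realized with a gradient-reversal layer or a min-max game between encoder and discriminator; the uniform-target cross-entropy above is just one common, simple stand-in for that objective.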
Author Details
– Wang, Deheng (1148684516@qq.com), School of Cyber Security, Guangdong University of Foreign Studies, Guangzhou, China
– Yang, Aimin (amyang18@163.com), School of Cyber Security, Guangdong University of Foreign Studies, Guangzhou, China
– Zhou, Yongmei (yongmeizhou@gdufs.edu.cn), School of Cyber Security, Guangdong University of Foreign Studies, Guangzhou, China
– Xie, Fenfang (xiefragrance@163.com), Laboratory of Language Engineering and Computing, Guangdong University of Foreign Studies, Guangzhou, China
– Ouyang, Zhouhao (tal-darim@foxmail.com), School of Computing, University of Leeds, Leeds, West Yorkshire, United Kingdom, LS2 9JT
– Peng, Sancheng (psc346@aliyun.com), Laboratory of Language Engineering and Computing, Guangdong University of Foreign Studies, Guangzhou, China
EISBN 9781665476744; 1665476745
ExternalDocumentID 9961285
Genre orig-research
PageCount 6
PublicationDate 2022-10-27
SubjectTerms Adaptation models
Adversarial network
Analytical models
Cross-lingual sentiment analysis
Knowledge distillation
Pre-trained model
Predictive models
Sentiment analysis
Training
Training data
Visualization
URI https://ieeexplore.ieee.org/document/9961285