Simultaneous neural machine translation with a reinforced attention mechanism

Bibliographic Details
Published in ETRI Journal, Vol. 43, no. 5, pp. 775-786
Main Authors Lee, YoHan, Shin, JongHun, Kim, YoungKil
Format Journal Article
Language English
Published Electronics and Telecommunications Research Institute (ETRI) 01.10.2021
한국전자통신연구원
ISSN 1225-6463
EISSN 2233-7326
DOI 10.4218/etrij.2020-0358


Abstract To translate in real time, a simultaneous translation system should determine when to stop reading source tokens and generate target tokens corresponding to the partial source sentence read up to that point. However, conventional attention-based neural machine translation (NMT) models cannot produce translations with adequate latency in online scenarios because they wait until a source sentence is completed before computing the alignment between source and target tokens. To address this issue, we propose a reinforcement learning (RL)-based attention mechanism, the reinforced attention mechanism, which allows a neural translation model to jointly train the stopping criterion and a partial translation model. The proposed attention mechanism comprises two modules, one to ensure translation quality and the other to address latency. Unlike previous RL-based simultaneous translation systems, which learn the stopping criterion from a fixed NMT model, the two modules can be trained jointly with a novel reward function. In our experiments, the proposed model achieves better translation quality with comparable latency relative to previous models.
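The decision loop the abstract describes, an agent that at each step either reads another source token or writes a target token, trained with a reward that trades translation quality against latency, can be illustrated with a minimal sketch. This is not the authors' implementation: the names (ReadWritePolicy, episode_reward), the linear quality-minus-latency reward, and the plain REINFORCE estimator below are illustrative assumptions only.

```python
# Minimal sketch of a jointly trained READ/WRITE policy for simultaneous
# translation, trained with REINFORCE on a combined quality/latency reward.
# All module names and the reward shape are assumptions, not the paper's.
import torch
import torch.nn as nn

READ, WRITE = 0, 1

class ReadWritePolicy(nn.Module):
    """Scores the two actions (READ a source token / WRITE a target token)
    from the decoder's current hidden state."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.scorer = nn.Linear(hidden_size, 2)

    def forward(self, state: torch.Tensor) -> torch.distributions.Categorical:
        return torch.distributions.Categorical(logits=self.scorer(state))

def episode_reward(quality: float, latency: float, lam: float = 0.5) -> float:
    """Hypothetical combined reward: a quality term (e.g. a sentence-level
    BLEU proxy) minus a weighted latency penalty (e.g. average lagging)."""
    return quality - lam * latency

# One simulated episode: sample actions, collect log-probabilities, then
# apply the terminal reward to every decision (vanilla REINFORCE, no baseline).
policy = ReadWritePolicy(hidden_size=8)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

log_probs = []
for _ in range(10):                 # stand-in for a real decoding loop
    state = torch.randn(8)          # stand-in for the real decoder state
    dist = policy(state)
    action = dist.sample()          # READ or WRITE
    log_probs.append(dist.log_prob(action))

reward = episode_reward(quality=0.6, latency=0.3)
loss = -reward * torch.stack(log_probs).sum()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

In the paper's framing, the quality term would come from the translation module and the latency term from the latency module, with both trained jointly rather than learning the stopping criterion on top of a fixed NMT model.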
KCI Citation Count: 0
Author Lee, YoHan
Shin, JongHun
Kim, YoungKil
Author_xml – sequence: 1
  givenname: YoHan
  orcidid: 0000-0001-8015-9609
  surname: Lee
  fullname: Lee, YoHan
  email: carep@etri.re.kr
  organization: Electronics and Telecommunications Research Institute
– sequence: 2
  givenname: JongHun
  orcidid: 0000-0002-4764-9371
  surname: Shin
  fullname: Shin, JongHun
  organization: Electronics and Telecommunications Research Institute
– sequence: 3
  givenname: YoungKil
  orcidid: 0000-0003-4560-0141
  surname: Kim
  fullname: Kim, YoungKil
  organization: Electronics and Telecommunications Research Institute
BackLink https://www.kci.go.kr/kciportal/ci/sereArticleSearch/ciSereArtiView.kci?sereArticleSearchBean.artiId=ART002771413 (Access content in National Research Foundation of Korea (NRF))
CitedBy_id crossref_primary_10_1007_s10489_023_04638_w
crossref_primary_10_1016_j_patrec_2023_06_001
crossref_primary_10_2478_amns_2023_2_01496
crossref_primary_10_1109_ACCESS_2023_3305927
crossref_primary_10_1007_s10015_024_00964_5
crossref_primary_10_3233_IDT_240657
Cites_doi 10.18653/v1/2020.iwslt-1.28
10.1162/neco.1997.9.8.1735
10.1007/BF00992696
10.1109/TNNLS.2019.2957276
10.4218/etrij.2017-0087
ContentType Journal Article
Copyright 1225‐6463/$ © 2021 ETRI
Copyright_xml – notice: 1225‐6463/$ © 2021 ETRI
DBID AAYXX
CITATION
DOA
ACYCR
DOI 10.4218/etrij.2020-0358
DatabaseName CrossRef
DOAJ Directory of Open Access Journals
Korean Citation Index
DatabaseTitle CrossRef
Database_xml – sequence: 1
  dbid: DOA
  name: DOAJ Directory of Open Access Journals
  url: https://www.doaj.org/
  sourceTypes: Open Website
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISSN 2233-7326
EndPage 786
ExternalDocumentID oai_kci_go_kr_ARTI_9877054
oai_doaj_org_article_ecb15e9bfc404bf380e55cdbe579baf3
10_4218_etrij_2020_0358
ETR212393
Genre article
GrantInformation_xml – fundername: Institute for Information & communications Technology Promotion (IITP)
  funderid: R7119‐16‐1001
ISSN 1225-6463
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 5
Language English
License http://doi.wiley.com/10.1002/tdm_license_1.1
http://onlinelibrary.wiley.com/termsAndConditions#vor
LinkModel DirectLink
Notes Funding information
Institute for Information & communications Technology Promotion (IITP), Grant/Award Number: R7119‐16‐1001
https://doi.org/10.4218/etrij.2020-0358
ORCID 0000-0002-4764-9371
0000-0001-8015-9609
0000-0003-4560-0141
OpenAccessLink https://doaj.org/article/ecb15e9bfc404bf380e55cdbe579baf3
PageCount 12
PublicationCentury 2000
PublicationDate October 2021
PublicationDateYYYYMMDD 2021-10-01
PublicationDate_xml – month: 10
  year: 2021
  text: October 2021
PublicationDecade 2020
PublicationTitle ETRI journal
PublicationYear 2021
Publisher Electronics and Telecommunications Research Institute (ETRI)
한국전자통신연구원
Publisher_xml – name: Electronics and Telecommunications Research Institute (ETRI)
– name: 한국전자통신연구원
References
– e_1_2_9_2_1: Bahdanau D. (2015), Neural machine translation by jointly learning to align and translate, Proc. Int. Conf. Learn. Representations, pp. 1-15.
– e_1_2_9_3_1: Luong T. (2015), Effective approaches to attention-based neural machine translation, Proc. Conf. Empir. Methods Nat. Lang. Process., pp. 1412-1421.
– e_1_2_9_4_1: Cho K. (2016), Can neural machine translation do simultaneous translation?, arXiv preprint, CoRR.
– e_1_2_9_5_1: Grissom A. (2014), Don't until the final verb wait: Reinforcement learning for simultaneous machine translation, Proc. Conf. Empir. Methods Nat. Lang. Process., pp. 1342-1352.
– e_1_2_9_6_1: Alinejad A. (2018), Prediction improves simultaneous neural machine translation, Proc. Conf. Empir. Methods Nat. Lang. Process., pp. 3022-3027.
– e_1_2_9_7_1: Ma M. (2019), STACL: Simultaneous translation with implicit anticipation and controllable latency using prefix-to-prefix framework, arXiv preprint, CoRR.
– e_1_2_9_8_1: Satija H. (2016), Simultaneous machine translation using deep reinforcement learning, Proc. ICML 2016 Workshop Abstr. Reinf. Learn., pp. 110-119.
– e_1_2_9_9_1: Gu J. (2016), Learning to translate in real-time with neural machine translation, arXiv preprint, CoRR.
– e_1_2_9_10_1: Chen Y. (2019), How to do simultaneous translation better with consecutive neural machine translation?, arXiv preprint, CoRR.
– e_1_2_9_11_1: Luo Y. (2017), Learning online alignments with continuous rewards policy gradient, Proc. IEEE Int. Conf. Acoust., Speech Signal Process. (ICASSP), pp. 2801-2805.
– e_1_2_9_12_1: Raffel C. (2017), Online and linear-time attention by enforcing monotonic alignments, Proc. Int. Conf. Mach. Learn., pp. 2837-2846.
– e_1_2_9_13_1: Arivazhagan N. (2019), Monotonic infinite lookback attention for simultaneous machine translation, arXiv preprint, CoRR.
– e_1_2_9_14_1: Zheng B. (2019), Simultaneous translation with flexible policy via restricted imitation learning, arXiv preprint, CoRR.
– e_1_2_9_15_1: Neural machine translation with GRU-gated attention model, IEEE Trans. Neural Netw. Learn. Syst. 31 (2020), no. 11, pp. 4688-4698, doi: 10.1109/TNNLS.2019.2957276.
– e_1_2_9_16_1: Zhang B. (2018), Accelerating neural transformer via an average attention network, arXiv preprint, CoRR.
– e_1_2_9_17_1: Fast speaker adaptation using extended diagonal linear transformation for deep neural networks, ETRI J. 41 (2019), no. 1, pp. 109-116, doi: 10.4218/etrij.2017-0087.
– e_1_2_9_18_1: Wu Y. (2016), Google's neural machine translation system: Bridging the gap between human and machine translation, arXiv preprint, CoRR.
– e_1_2_9_19_1: Gehring J. (2017), Convolutional sequence to sequence learning, Proc. Int. Conf. Mach. Learn., pp. 1243-1252.
– e_1_2_9_20_1: Vaswani A. (2017), Attention is all you need, arXiv preprint, CoRR.
– e_1_2_9_21_1: Long short-term memory, Neural Comput. 9 (1997), no. 8, pp. 1735-1780, doi: 10.1162/neco.1997.9.8.1735.
– e_1_2_9_22_1: Simple statistical gradient-following algorithms for connectionist reinforcement learning, Mach. Learn. 8 (1992), no. 3-4, pp. 229-256, doi: 10.1007/BF00992696.
– e_1_2_9_23_1: Cherry C. (2019), Thinking slow about latency evaluation for simultaneous machine translation, arXiv preprint, CoRR.
– e_1_2_9_24_1: Hou J. (2020), Segment boundary detection directed attention for online end-to-end speech recognition, EURASIP J. Audio, Speech, Music Process. 1 (2020), pp. 1-16.
– e_1_2_9_25_1: Sennrich R. (2015), Neural machine translation of rare words with subword units, arXiv preprint, CoRR.
– e_1_2_9_26_1: Kudo T. (2018), Sentencepiece: A simple and language independent subword tokenizer and detokenizer for neural text processing, Proc. Conf. Empir. Methods Nat. Lang. Process., pp. 66-71.
– e_1_2_9_27_1: Kingma D. P. (2015), Adam: A method for stochastic optimization, Proc. Int. Conf. Learn. Representations, pp. 1-15.
– e_1_2_9_28_1: Chen M. (2018), The best of both worlds: Combining recent advances in neural machine translation, arXiv preprint, CoRR.
– e_1_2_9_29_1: Papineni K. (2002), Bleu: A method for automatic evaluation of machine translation, Proc. Assoc. Comput. Linguist., pp. 311-318.
– e_1_2_9_30_1: Ma X. (2019), Monotonic multihead attention, Proc. Int. Conf. Learn. Representations (Addis Ababa, Ethiopia), pp. 1-11.
– e_1_2_9_31_1: Chiu C. C. (2017), Monotonic chunkwise attention, Proc. Int. Conf. Learn. Representations (Vancouver, Canada), pp. 1-16.
– e_1_2_9_32_1: Jang E. (2017), Categorical reparameterization with Gumbel-softmax, Proc. Int. Conf. Learn. Representations (Toulon, France), pp. 1-12.
– e_1_2_9_33_1: Shen T. (2018), Reinforced self-attention network: A hybrid of hard and soft attention for sequence modeling, Proc. Int. Joint Conf. Artif. Intell., pp. 4345-4352.
– e_1_2_9_34_1: Schneider F. (2020), Towards stream translation: Adaptive computation time for simultaneous machine translation, Proc. Int. Conf. Spoken Lang. Transl., pp. 228-326, doi: 10.18653/v1/2020.iwslt-1.28.
SourceID nrf
doaj
crossref
wiley
SourceType Open Website
Enrichment Source
Index Database
Publisher
StartPage 775
SubjectTerms attention mechanism
neural network
reinforcement learning
simultaneous machine translation
electronics/information and communications engineering
Title Simultaneous neural machine translation with a reinforced attention mechanism
URI https://onlinelibrary.wiley.com/doi/abs/10.4218%2Fetrij.2020-0358
https://doaj.org/article/ecb15e9bfc404bf380e55cdbe579baf3
https://www.kci.go.kr/kciportal/ci/sereArticleSearch/ciSereArtiView.kci?sereArticleSearchBean.artiId=ART002771413
Volume 43
hasFullText 1
inHoldings 1
ispartofPNX ETRI Journal, 2021, 43(5), pp. 775-786