Chinese Academy of Sciences Institutional Repositories Grid
Sequence-Level Training for Non-Autoregressive Neural Machine Translation

Document Type: Journal Article

Authors: Shao, Chenze (2); Feng, Yang (2); Zhang, Jinchao (1); Meng, Fandong (1); Zhou, Jie (1)
Journal: COMPUTATIONAL LINGUISTICS
Publication Date: 2021-12-01
Volume: 47, Issue: 4, Pages: 891-925
ISSN: 0891-2017
DOI: 10.1162/COLI_a_00421
Abstract: In recent years, Neural Machine Translation (NMT) has achieved notable results in various translation tasks. However, the word-by-word generation manner determined by the autoregressive mechanism leads to high translation latency of NMT and restricts its low-latency applications. Non-Autoregressive Neural Machine Translation (NAT) removes the autoregressive mechanism and achieves significant decoding speedup by generating target words independently and simultaneously. Nevertheless, NAT still takes the word-level cross-entropy loss as the training objective, which is not optimal because the output of NAT cannot be properly evaluated due to the multimodality problem. In this article, we propose using sequence-level training objectives to train NAT models, which evaluate the NAT outputs as a whole and correlate well with the real translation quality. First, we propose training NAT models to optimize sequence-level evaluation metrics (e.g., BLEU) based on several novel reinforcement algorithms customized for NAT, which outperform the conventional method by reducing the variance of gradient estimation. Second, we introduce a novel training objective for NAT models, which aims to minimize the Bag-of-N-grams (BoN) difference between the model output and the reference sentence. The BoN training objective is differentiable and can be calculated efficiently without doing any approximations. Finally, we apply a three-stage training strategy to combine these two methods to train the NAT model. We validate our approach on four translation tasks (WMT14 En<->De, WMT16 En<->Ro), which shows that our approach largely outperforms NAT baselines and achieves remarkable performance on all translation tasks. The source code is available at https://github.com/ictnlp/Seq-NAT.
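The abstract's Bag-of-N-grams (BoN) objective rests on the observation that, because a NAT decoder predicts every target position independently, the expected count of any n-gram in the output is a sum over start positions of products of per-position token probabilities, so an L1-style distance to the reference's n-gram counts can be differentiated with respect to the decoder logits. The following is a minimal PyTorch sketch of that idea under toy assumptions; the tensor shapes, function names, and restriction to reference n-grams are illustrative simplifications, not the authors' implementation (see the linked Seq-NAT repository for the actual code).

```python
# Minimal sketch of a Bag-of-N-grams (BoN) style objective for a NAT model.
# Shapes and helper names are hypothetical; this is a simplified illustration.
import torch
from collections import Counter

def expected_ngram_count(probs, ngram):
    # probs: (T, V) per-position token distributions (independent, as in NAT).
    # Expected number of occurrences of `ngram` = sum over start positions of
    # the product of the per-position probabilities of its tokens.
    T = probs.size(0)
    n = len(ngram)
    total = probs.new_zeros(())
    for t in range(T - n + 1):
        p = probs.new_ones(())
        for i, tok in enumerate(ngram):
            p = p * probs[t + i, tok]
        total = total + p
    return total

def bon_l1_distance(probs, reference, n=2):
    # L1 distance between the expected BoN of the model output and the BoN of
    # the reference. Only n-grams appearing in the reference can contribute to
    # the clipped match term, so the sum stays small and exact.
    T = probs.size(0)
    ref_counts = Counter(tuple(reference[i:i + n])
                         for i in range(len(reference) - n + 1))
    match = probs.new_zeros(())
    for ngram, c in ref_counts.items():
        match = match + torch.minimum(expected_ngram_count(probs, ngram),
                                      torch.tensor(float(c)))
    # Total expected n-gram mass on each side is a constant, so the full L1
    # distance reduces to the two totals minus twice the clipped match.
    return (T - n + 1) + (len(reference) - n + 1) - 2 * match

# Toy usage: vocabulary of 5 tokens, output length 4, reference [1, 2, 3, 4].
logits = torch.randn(4, 5, requires_grad=True)
probs = torch.softmax(logits, dim=-1)
loss = bon_l1_distance(probs, [1, 2, 3, 4], n=2)
loss.backward()  # differentiable with respect to the decoder logits
```

Note that this sketch covers only the BoN distance; the paper's other contributions (variance-reduced reinforcement algorithms for metrics such as BLEU and the three-stage training strategy) are not reproduced here.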
WOS Research Areas: Computer Science; Linguistics
Language: English
Publisher: MIT PRESS
WOS Record Number: WOS:000753228200006
Source URL: [http://119.78.100.204/handle/2XEOYT63/19008]
Collection: Institute of Computing Technology, Chinese Academy of Sciences - Journal Articles (English)
Corresponding Author: Shao, Chenze
Author Affiliations:
1. Tencent Inc, WeChat AI, Pattern Recognit Ctr, Shenzhen, Peoples R China
2. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing, Peoples R China
Recommended Citation Formats
GB/T 7714: Shao, Chenze, Feng, Yang, Zhang, Jinchao, et al. Sequence-Level Training for Non-Autoregressive Neural Machine Translation[J]. COMPUTATIONAL LINGUISTICS, 2021, 47(4): 891-925.
APA: Shao, Chenze, Feng, Yang, Zhang, Jinchao, Meng, Fandong, & Zhou, Jie. (2021). Sequence-Level Training for Non-Autoregressive Neural Machine Translation. COMPUTATIONAL LINGUISTICS, 47(4), 891-925.
MLA: Shao, Chenze, et al. "Sequence-Level Training for Non-Autoregressive Neural Machine Translation". COMPUTATIONAL LINGUISTICS 47.4 (2021): 891-925.

Deposit Method: OAI harvesting

Source: Institute of Computing Technology

