Chinese Academy of Sciences Institutional Repositories Grid
Effectively training neural machine translation models with monolingual data

Document Type: Journal Article

Authors: Yang, Zhen (1,2); Chen, Wei (2); Wang, Feng (2); Xu, Bo (2)
Journal: NEUROCOMPUTING
Publication Date: 2019-03-14
Volume: 333, Pages: 240-247
Keywords: Neural machine translation; Monolingual data; Gate-enhanced; Source-side and target-side; Effectively
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2018.12.032
Corresponding Author: Yang, Zhen (yangzhen2014@ia.ac.cn)
Abstract: Improving neural machine translation (NMT) models with monolingual data has attracted growing interest in this area, and back-translation for monolingual data augmentation (Sennrich et al., 2016) has recently been regarded as a promising development. While the naive back-translation approach improves translation performance substantially, we notice that it does not use monolingual data very effectively, because traditional NMT models make no distinction between the true parallel corpus and the back-translated synthetic parallel corpus. This paper proposes a gate-enhanced NMT model which makes use of monolingual data more effectively. The central idea is to separate the data flow of monolingual data and parallel data into different channels by an elegantly designed gate, which enables the model to perform different transformations according to the type of the input sequence, i.e., monolingual data and parallel data. Experiments on Chinese-English and English-German translation tasks show that our approach achieves substantial improvements over strong baselines and that the gate-enhanced NMT model can utilize the source-side and target-side monolingual data at the same time. (C) 2018 Elsevier B.V. All rights reserved.
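The abstract only sketches the gating idea, so the following is a minimal, hypothetical PyTorch sketch of how a gate conditioned on the data type (true parallel vs. back-translated) could blend two transformation channels. The module name GatedChannel, the binary flag input, and the exact gating formula are illustrative assumptions and are not taken from the paper's published architecture.

```python
# Hypothetical sketch, not the authors' implementation: a learned gate that mixes
# two projection channels of an encoder state, conditioned on whether the example
# comes from the true parallel corpus or from back-translated monolingual data.
import torch
import torch.nn as nn

class GatedChannel(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        self.parallel_proj = nn.Linear(hidden_size, hidden_size)   # channel intended for true parallel pairs
        self.synthetic_proj = nn.Linear(hidden_size, hidden_size)  # channel intended for back-translated pairs
        self.gate = nn.Linear(hidden_size + 1, hidden_size)        # gate conditioned on the data-type flag

    def forward(self, h: torch.Tensor, is_synthetic: torch.Tensor) -> torch.Tensor:
        # h: (batch, hidden_size) encoder state; is_synthetic: (batch, 1) flag, 1.0 for back-translated data
        g = torch.sigmoid(self.gate(torch.cat([h, is_synthetic], dim=-1)))
        return g * self.parallel_proj(h) + (1.0 - g) * self.synthetic_proj(h)

# Usage: route a batch of back-translated (synthetic) examples through the gate.
layer = GatedChannel(512)
h = torch.randn(8, 512)          # dummy encoder states
flag = torch.ones(8, 1)          # 1.0 marks synthetic (back-translated) pairs
mixed = layer(h, flag)           # (8, 512) gated mixture of the two channels
```

The point of the sketch is only the separation of data flow: because the gate sees the data-type flag, the model can learn different transformations for parallel and monolingual (back-translated) inputs instead of treating both corpora identically.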
WOS Keywords: NETWORK
Funding Project: National Program on Key Basic Research Project of China (973 Program) [2013CB329302]
WOS Research Area: Computer Science
Language: English
WOS Accession Number: WOS:000456834100022
Publisher: ELSEVIER SCIENCE BV
Funding Organization: National Program on Key Basic Research Project of China (973 Program)
Source URL: [http://ir.ia.ac.cn/handle/173211/25313]
Collection: 数字内容技术与服务研究中心_听觉模型与认知计算 (Research Center for Digital Content Technology and Services, Auditory Model and Cognitive Computing)
Author Affiliations:
1. Univ Chinese Acad Sci, Beijing, Peoples R China
2. Chinese Acad Sci, Inst Automat, 95 ZhongGuanCun East Rd, Beijing 100190, Peoples R China
Recommended Citation
GB/T 7714
Yang, Zhen, Chen, Wei, Wang, Feng, et al. Effectively training neural machine translation models with monolingual data[J]. NEUROCOMPUTING, 2019, 333: 240-247.
APA: Yang, Zhen, Chen, Wei, Wang, Feng, & Xu, Bo. (2019). Effectively training neural machine translation models with monolingual data. NEUROCOMPUTING, 333, 240-247.
MLA: Yang, Zhen, et al. "Effectively training neural machine translation models with monolingual data". NEUROCOMPUTING 333 (2019): 240-247.

Ingestion Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright and all rights are reserved.