Chinese Academy of Sciences Institutional Repositories Grid
A Compact and Language-Sensitive Multilingual Translation Method

Document Type: Conference Paper

Authors: Wang, Yining1,3; Zhou, Long1,3; Zhang, Jiajun1,3; Zhai, Feifei4; Xu, Jingfang4; Zong, Chengqing1,2,3
Publication Date: 2019-07
Conference Date: July 28 - August 2, 2019
Conference Venue: Florence, Italy
Abstract

Multilingual neural machine translation (Multi-NMT) with one encoder-decoder model has made remarkable progress due to its simple deployment. However, this multilingual translation paradigm does not make full use of language commonality and parameter sharing between encoder and decoder. Furthermore, this kind of paradigm cannot outperform individual models trained on bilingual corpora in most cases. In this paper, we propose a compact and language-sensitive method for multilingual translation. To maximize parameter sharing, we first present a universal representor to replace both the encoder and decoder. To make the representor sensitive to specific languages, we further introduce language-sensitive embedding, attention, and discriminator with the ability to enhance model performance. We verify our methods on various translation scenarios, including one-to-many, many-to-many and zero-shot. Extensive experiments demonstrate that our proposed methods remarkably outperform strong standard multilingual translation systems on WMT and IWSLT datasets. Moreover, we find that our model is especially helpful in low-resource and zero-shot translation scenarios.
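
To make the shared-parameter idea in the abstract concrete, the following Python (PyTorch) sketch shows one plausible reading of it: a single Transformer layer stack acting as a "universal representor" for both the source and target sides, with a per-language embedding added to every token so the model stays language-sensitive. The class name, hyperparameters, and the simplified handling of cross-attention are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class UniversalRepresentor(nn.Module):
    """Hypothetical sketch: one shared layer stack replaces encoder and decoder."""

    def __init__(self, num_languages, vocab_size, d_model=512, n_layers=6, n_heads=8):
        super().__init__()
        # One shared token embedding table for all languages.
        self.token_emb = nn.Embedding(vocab_size, d_model)
        # Language-sensitive embedding: a learned vector per language,
        # added to every token representation of that language.
        self.lang_emb = nn.Embedding(num_languages, d_model)
        # A single layer stack reused for both encoding and decoding
        # (maximal parameter sharing); TransformerEncoderLayer is only a
        # stand-in for the shared layers here.
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.shared_layers = nn.TransformerEncoder(layer, n_layers)
        self.output_proj = nn.Linear(d_model, vocab_size)

    def represent(self, tokens, lang_id):
        # tokens: (batch, seq_len) token ids; lang_id: (batch,) language ids.
        x = self.token_emb(tokens) + self.lang_emb(lang_id).unsqueeze(1)
        return self.shared_layers(x)

    def forward(self, src_tokens, src_lang, tgt_tokens, tgt_lang):
        # Source and target sides pass through the *same* parameters; only
        # the language embedding signals which language is being processed.
        src_repr = self.represent(src_tokens, src_lang)
        tgt_repr = self.represent(tgt_tokens, tgt_lang)
        # A real decoder would attend over src_repr with causal masking and
        # add the language-sensitive attention/discriminator; this sketch
        # stops at producing target-side logits.
        return self.output_proj(tgt_repr)
```

Usage would look like `logits = model(src_ids, src_lang_ids, tgt_ids, tgt_lang_ids)`; the language-sensitive attention and discriminator described in the abstract are omitted from this sketch.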

Source URL: [http://ir.ia.ac.cn/handle/173211/39233]
Collection: National Laboratory of Pattern Recognition_Natural Language Processing
Corresponding Author: Zhang, Jiajun
Author Affiliations: 1.National Laboratory of Pattern Recognition, CASIA, Beijing, China
2.CAS Center for Excellence in Brain Science and Intelligence Technology, Beijing, China
3.University of Chinese Academy of Sciences, Beijing, China
4.Sogou Inc., Beijing, China
Recommended Citation
GB/T 7714
Wang, Yining, Zhou, Long, Zhang, Jiajun, et al. A Compact and Language-Sensitive Multilingual Translation Method[C]. In: . Florence, Italy. July 28 - August 2, 2019.

Deposit Method: OAI Harvesting

Source: Institute of Automation

