Chinese Academy of Sciences Institutional Repositories Grid
Multi-Task Neural Model for Agglutinative Language Translation

Document type: Conference paper

Authors: Pan, YR (Pan, Yirong) 1,2,3; Li, X (Li, Xiao) 1,2,3; Yang, YT (Yang, Yating) 1,2,3; Dong, R (Dong, Rui) 1,2,3
Publication date: 2020
Conference date: JUL 05-10, 2020
Conference location: ELECTR NETWORK (online)
Abstract (English)

Neural machine translation (NMT) has recently achieved impressive performance by using large-scale parallel corpora. However, it struggles in the low-resource and morphologically rich scenario of the agglutinative language translation task. Inspired by the finding that monolingual data can greatly improve NMT performance, we propose a multi-task neural model that jointly learns to perform bi-directional translation and agglutinative language stemming. Our approach employs a shared encoder and decoder to train a single model without changing the standard NMT architecture; instead, a token is added before each source-side sentence to specify the desired target output of the two different tasks. Experimental results on Turkish-English and Uyghur-Chinese show that our proposed approach can significantly improve translation performance on agglutinative languages by using a small amount of monolingual data.
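To make the task-token idea in the abstract concrete, the following minimal Python sketch shows how bi-directional translation data and monolingual stemming data could be merged into one training corpus for a single shared encoder-decoder model, with a token prepended to each source sentence to indicate the desired output. This is an illustrative assumption, not the authors' released code: the token names (<2en>, <2tr>, <stem>), the helper build_joint_corpus, and the toy sentences are invented for demonstration.

# Minimal sketch (assumed, not the authors' code) of task-token data preparation:
# a special token prepended to each source sentence tells the shared NMT model
# which target output to produce (translation direction or stemming).

from typing import List, Tuple

def build_joint_corpus(
    tr_en: List[Tuple[str, str]],      # Turkish-English parallel sentence pairs
    tr_stems: List[Tuple[str, str]],   # Turkish sentence -> stemmed Turkish sentence
) -> List[Tuple[str, str]]:
    """Merge bi-directional translation and monolingual stemming data into one training set."""
    examples = []
    for tr, en in tr_en:
        examples.append(("<2en> " + tr, en))        # Turkish -> English translation
        examples.append(("<2tr> " + en, tr))        # English -> Turkish translation
    for tr, stemmed in tr_stems:
        examples.append(("<stem> " + tr, stemmed))  # stemming task from monolingual data
    return examples

if __name__ == "__main__":
    parallel = [("bu bir kitap .", "this is a book .")]
    stemming = [("kitaplarimizdan", "kitap")]
    for src, tgt in build_joint_corpus(parallel, stemming):
        print(src + "\t" + tgt)

All tasks share the same model parameters, so the only change to a standard NMT pipeline under this sketch is the extra source-side token.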

Proceedings: ASSOC COMPUTATIONAL LINGUISTICS-ACL, 209 N Eighth Street, Stroudsburg, PA 18360 USA
ISBN: 978-1-952148-03-3
Source URL: [http://ir.xjipc.cas.cn/handle/365002/7880]
Collection: Xinjiang Technical Institute of Physics and Chemistry_Multilingual Information Technology Laboratory
Xinjiang Technical Institute of Physics and Chemistry, Chinese Academy of Sciences
Author affiliations: 1. Xinjiang Lab Minor Speech & Language Informat Pro, Urumqi, Peoples R China
2.Univ Chinese Acad Sci, Beijing, Peoples R China
3.Chinese Acad Sci, Xinjiang Tech Inst Phys & Chem, Beijing, Peoples R China
Recommended citation
GB/T 7714
Pan, YR, Li, X, Yang, YT, et al. Multi-Task Neural Model for Agglutinative Language Translation[C]. In: ELECTR NETWORK. JUL 05-10, 2020.

Ingest method: OAI harvesting

Source: Xinjiang Technical Institute of Physics and Chemistry
