Chinese Academy of Sciences Institutional Repositories Grid
Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation

Document Type: Journal Article

Authors: Zhang, XL (Zhang, Xinlu); Li, X (Li, Xiao); Yang, YT (Yang, Yating); Dong, R (Dong, Rui)
Journal: IEEE ACCESS
Publication Date: 2020
Volume: 8; Issue: 11; Pages: 206638-206645
Keywords: Training; Decoding; Vocabulary; Task analysis; Standards; Knowledge engineering; Computational modeling; Neural machine translation; Knowledge distillation; Prior knowledge
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3037821
Abstract

Knowledge Distillation (KD) aims to distill the knowledge of a cumbersome teacher model into a lightweight student model. Its success is generally attributed to the privileged information on similarities among categories provided by the teacher model, and in this sense only strong teacher models are deployed to teach weaker students in practice. However, in low-resource neural machine translation, a stronger teacher model is not available. To counteract this, we propose a novel Teacher-free Knowledge Distillation framework for low-resource neural machine translation, in which the model learns from a manually designed regularization distribution that acts as a virtual teacher. This manually designed prior distribution not only captures similarity information between words but also provides effective regularization for model training. Experimental results show that the proposed method effectively improves performance on low-resource languages.
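The abstract gives only a high-level description of the method. As an illustration of the general idea, the following is a minimal PyTorch sketch of a teacher-free distillation loss, assuming a label-smoothing-style virtual teacher that places probability correct_prob on the gold token and spreads the remainder uniformly over the rest of the vocabulary. The function name and the hyperparameters correct_prob, temperature, and alpha are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def teacher_free_kd_loss(logits, targets, num_classes,
                         correct_prob=0.9, temperature=5.0, alpha=0.5):
    """Cross-entropy combined with KL divergence against a hand-designed
    'virtual teacher' distribution; no trained teacher model is needed.

    Hypothetical sketch: the exact prior used in the paper may differ.
    """
    # Standard cross-entropy against the gold target tokens.
    ce = F.cross_entropy(logits, targets)

    # Manually designed teacher: correct_prob on the gold token,
    # the remaining mass spread uniformly over all other tokens.
    uniform_prob = (1.0 - correct_prob) / (num_classes - 1)
    teacher = torch.full_like(logits, uniform_prob)
    teacher.scatter_(1, targets.unsqueeze(1), correct_prob)

    # Temperature-smoothed KL between student predictions and the prior.
    kl = F.kl_div(F.log_softmax(logits / temperature, dim=1),
                  teacher, reduction="batchmean") * temperature ** 2

    return (1 - alpha) * ce + alpha * kl

# Example: a batch of 4 target-token predictions over a 1000-word vocabulary.
logits = torch.randn(4, 1000)
targets = torch.randint(0, 1000, (4,))
loss = teacher_free_kd_loss(logits, targets, num_classes=1000)
```

With correct_prob close to 1 and temperature 1, the KL term reduces to ordinary label smoothing, which is why such a hand-designed prior can regularize training even without a trained teacher.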

WOS Record Number: WOS:000594434200001
Source URL: http://ir.xjipc.cas.cn/handle/365002/7882
Collection: Multilingual Information Technology Laboratory, Xinjiang Technical Institute of Physics and Chemistry, Chinese Academy of Sciences
Affiliation: Chinese Acad Sci, Xinjiang Tech Inst Phys & Chem, Urumqi 830011, Peoples R China
Recommended Citation:
GB/T 7714: Zhang, XL, Li, X, Yang, YT, et al. Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation[J]. IEEE ACCESS, 2020, 8(11): 206638-206645.
APA: Zhang, XL, Li, X, Yang, YT, & Dong, R. (2020). Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation. IEEE ACCESS, 8(11), 206638-206645.
MLA: Zhang, XL, et al. "Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation". IEEE ACCESS 8.11 (2020): 206638-206645.

Deposit Method: OAI harvesting

Source: Xinjiang Technical Institute of Physics and Chemistry


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.