Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation
Document Type: Journal Article
Authors | Zhang, XL (Zhang, Xinlu); Li, X (Li, Xiao); Yang, YT (Yang, Yating); Dong, R
Journal | IEEE ACCESS
Publication Date | 2020
Volume | 8
Issue | 11
Pages | 206638-206645
Keywords | Training; Decoding; Vocabulary; Task analysis; Standards; Knowledge engineering; Computational modeling; Neural machine translation; knowledge distillation; prior knowledge
ISSN | 2169-3536
DOI | 10.1109/ACCESS.2020.3037821 |
Abstract | Knowledge Distillation (KD) aims to distill the knowledge of a cumbersome teacher model into a lightweight student model. Its success is generally attributed to the privileged information on similarities among categories provided by the teacher model, and in this sense only strong teacher models are deployed to teach weaker students in practice. However, in low-resource neural machine translation, a stronger teacher model is not available. We therefore propose a novel Teacher-free Knowledge Distillation framework for low-resource neural machine translation, in which the model learns from a manually designed regularization distribution that acts as a virtual teacher. This manually designed prior distribution not only conveys similarity information between words, but also provides effective regularization for model training. Experimental results show that the proposed method effectively improves translation performance on low-resource languages. (An illustrative sketch of a teacher-free distillation loss follows this record.)
WOS Accession Number | WOS:000594434200001
Source URL | http://ir.xjipc.cas.cn/handle/365002/7882
Collection | Multilingual Information Technology Laboratory, Xinjiang Technical Institute of Physics and Chemistry; Xinjiang Technical Institute of Physics and Chemistry, Chinese Academy of Sciences
Author Affiliation | Chinese Acad Sci, Xinjiang Tech Inst Phys & Chem, Urumqi 830011, Peoples R China
Recommended Citation (GB/T 7714) | Zhang, XL, Li, X, Yang, YT, et al. Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation[J]. IEEE ACCESS, 2020, 8(11): 206638-206645.
APA | Zhang, XL, Li, X, Yang, YT, & Dong, R. (2020). Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation. IEEE ACCESS, 8(11), 206638-206645.
MLA | Zhang, XL, et al. "Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation." IEEE ACCESS 8.11 (2020): 206638-206645.
Ingest Method: OAI Harvesting
Source: Xinjiang Technical Institute of Physics and Chemistry
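To illustrate the idea described in the abstract, below is a minimal PyTorch sketch of a teacher-free distillation loss in which a manually designed prior distribution plays the role of the virtual teacher. The function name, the uniform (label-smoothing-like) prior, and all hyperparameter values are assumptions for illustration only; the paper's actual regularization distribution and training setup may differ.

```python
import torch
import torch.nn.functional as F

def teacher_free_kd_loss(logits, targets, correct_prob=0.9,
                         temperature=2.0, alpha=0.5, pad_id=0):
    """Loss = (1 - alpha) * CE(gold) + alpha * KL(virtual teacher || student).

    The "virtual teacher" is a manually designed prior that assigns
    `correct_prob` to the gold token and spreads the remaining mass
    uniformly over the rest of the vocabulary.

    logits:  (num_positions, vocab_size) decoder outputs
    targets: (num_positions,) gold token ids
    """
    vocab_size = logits.size(-1)

    # Drop padding positions before computing either term.
    mask = targets.ne(pad_id)
    logits, targets = logits[mask], targets[mask]

    # Hard-label cross-entropy with the gold tokens.
    ce = F.cross_entropy(logits, targets)

    # Manually designed virtual-teacher distribution.
    virtual = torch.full_like(logits, (1.0 - correct_prob) / (vocab_size - 1))
    virtual.scatter_(1, targets.unsqueeze(1), correct_prob)

    # KL divergence between the prior and the temperature-softened student,
    # scaled by T^2 as in standard knowledge distillation. (Softening only
    # the student side is a simplification for this sketch.)
    log_student = F.log_softmax(logits / temperature, dim=-1)
    kl = F.kl_div(log_student, virtual, reduction="batchmean") * temperature ** 2

    return (1.0 - alpha) * ce + alpha * kl
```

Note the connection to label smoothing: with temperature = 1, the KL term equals (up to an additive constant) cross-entropy against smoothed targets, i.e. ordinary label smoothing, which is why such a prior can regularize training even without a trained teacher model.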