Chinese Academy of Sciences Institutional Repositories Grid
Norm-based Noisy Corpora Filtering and Refurbishing in Neural Machine Translation

Document Type: Conference Paper

Authors: Yu, Lu 1,2; Jiajun, Zhang 1,2
Publication Date: 2022-12
Conference Date: 2022-12
Conference Venue: Online
Keywords: Neural Machine Translation
Abstract (English)

Recent advances in neural machine translation depend on massive parallel corpora, which are collected from open sources with little guarantee of quality. This stresses the need for noisy corpora filtering, but existing methods are insufficient to solve the issue: they spend much time ensembling multiple scorers trained on clean bitexts, which are unavailable for low-resource languages in practice. In this paper, we propose a norm-based noisy corpora filtering and refurbishing method that requires neither external data nor costly scorers. Noisy and clean samples are separated based on how much information from the source and target sides the model requires to fit the given translation. For non-parallel sentence pairs, the target-side translation history is much more important than the source context, in contrast to parallel ones. The amount of these two information flows can be measured by the norms of the source-side and target-side context vectors. Moreover, we propose to reuse the discovered noisy data by generating pseudo labels via online knowledge distillation. Extensive experiments show that our proposed filtering method performs comparably with state-of-the-art noisy corpora filtering techniques while being more efficient and easier to operate. Noisy sample refurbishing further enhances performance by making the most of the given data.
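The scoring idea in the abstract can be sketched as follows: for each sentence pair, compare the average norm of the model's source-side context vectors against that of its target-side history vectors, and flag pairs that rely mostly on target-side history as likely non-parallel. This is a minimal illustrative sketch, not the paper's implementation; the function names, vector shapes, and the fixed threshold are assumptions for demonstration.

```python
import numpy as np

def norm_ratio_score(src_ctx, tgt_ctx):
    """Score one sentence pair by how much the model draws on source
    context versus target-side history.

    src_ctx, tgt_ctx: arrays of shape (num_target_tokens, hidden_dim),
    e.g. cross-attention context vectors and decoder self-attention
    (history) vectors for each generated token.

    Returns the ratio of mean source-context norm to mean target-history
    norm; a larger value suggests a more source-reliant, likely parallel pair.
    """
    src_norm = np.linalg.norm(src_ctx, axis=-1).mean()
    tgt_norm = np.linalg.norm(tgt_ctx, axis=-1).mean()
    return src_norm / (tgt_norm + 1e-9)

def split_corpus(scores, threshold):
    """Partition sample indices into clean (score >= threshold) and noisy."""
    clean = [i for i, s in enumerate(scores) if s >= threshold]
    noisy = [i for i, s in enumerate(scores) if s < threshold]
    return clean, noisy

# Toy usage: three sentence pairs with hypothetical per-token vectors.
rng = np.random.default_rng(0)
pairs = [(rng.normal(size=(6, 16)) * scale, rng.normal(size=(6, 16)))
         for scale in (2.0, 0.3, 1.5)]
scores = [norm_ratio_score(s, t) for s, t in pairs]
clean_idx, noisy_idx = split_corpus(scores, threshold=1.0)
```

Under the paper's refurbishing step, the samples in `noisy_idx` would not be discarded but relabeled with pseudo targets produced by the model itself (online knowledge distillation), so the full corpus is still exploited.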

Subject Areas: Computer Science and Technology; Artificial Intelligence
Language: English
Source URL: http://ir.ia.ac.cn/handle/173211/51838
Collection: National Laboratory of Pattern Recognition — Natural Language Processing
Corresponding Author: Jiajun, Zhang
Affiliations:
1. School of Artificial Intelligence, University of Chinese Academy of Sciences
2. National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Yu, Lu, Jiajun, Zhang. Norm-based Noisy Corpora Filtering and Refurbishing in Neural Machine Translation[C]. In: Online. 2022-12.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.