NCLS: Neural Cross-Lingual Summarization
Document type: Conference paper
Authors | Zhu JN(朱军楠)1,2; Wang Q(王迁)1,2; Wang YN(王亦宁)1,2; Zhou Y(周玉)1,2; Zhang JJ(张家俊)1,2; Wang SN(王少楠)1,2; Zong CQ(宗成庆)1,2,3 |
Publication date | 2019-11 |
Conference dates | 2019-11-03 to 2019-11-07 |
Conference location | Hong Kong, China |
Abstract | Cross-lingual summarization (CLS) is the task of producing a summary in one language for a source document written in a different language. Existing methods simply divide this task into two steps, summarization and translation, which leads to error propagation. To address this, we present, for the first time, an end-to-end CLS framework, which we refer to as Neural Cross-Lingual Summarization (NCLS). Moreover, we propose to further improve NCLS by incorporating two related tasks, monolingual summarization and machine translation, into the training process of CLS under multi-task learning. Due to the lack of supervised CLS data, we propose a round-trip translation strategy to acquire two high-quality, large-scale CLS datasets based on existing monolingual summarization datasets. Experimental results show that NCLS achieves remarkable improvements over traditional pipeline methods on both English-to-Chinese and Chinese-to-English human-corrected CLS test sets. In addition, NCLS with multi-task learning further and significantly improves the quality of the generated summaries. We make our dataset and code publicly available at http://www.nlpr.ia.ac.cn/cip/dataset.htm. |
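The round-trip translation strategy described in the abstract can be illustrated with a short sketch: translate each monolingual reference summary into the target language, translate it back, and keep only pairs whose back-translation stays close to the original summary. The code below is a minimal illustration under stated assumptions, not the authors' released pipeline: `translate` is a placeholder for any machine-translation system, the unigram-F1 score stands in for the ROUGE-based filtering one would normally use, and names such as `round_trip_filter` and the 0.45 threshold are hypothetical.

```python
# Hypothetical sketch of a round-trip translation filter for building CLS data.
# `translate(text, src_lang, tgt_lang)` is assumed to be supplied by the caller.
from typing import Callable, List, Tuple

def unigram_f1(reference: str, candidate: str) -> float:
    """Token-overlap F1 between two strings (a crude proxy for ROUGE-1)."""
    ref, cand = reference.lower().split(), candidate.lower().split()
    if not ref or not cand:
        return 0.0
    overlap = len(set(ref) & set(cand))
    if overlap == 0:
        return 0.0
    precision, recall = overlap / len(cand), overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

def round_trip_filter(
    documents: List[str],
    summaries: List[str],
    translate: Callable[[str, str, str], str],  # (text, src_lang, tgt_lang) -> text
    src: str = "en",
    tgt: str = "zh",
    threshold: float = 0.45,  # illustrative cutoff, not the paper's exact value
) -> List[Tuple[str, str]]:
    """Build (source document, target-language summary) pairs via round-trip translation."""
    cls_pairs = []
    for doc, summ in zip(documents, summaries):
        tgt_summ = translate(summ, src, tgt)          # forward translation
        back_summ = translate(tgt_summ, tgt, src)     # back translation
        if unigram_f1(summ, back_summ) >= threshold:  # keep only faithful translations
            cls_pairs.append((doc, tgt_summ))
    return cls_pairs
```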
Source publication author | Association for Computational Linguistics |
Proceedings publisher | Association for Computational Linguistics |
Language | English |
Source URL | [http://ir.ia.ac.cn/handle/173211/39083] |
Collection | National Laboratory of Pattern Recognition_Natural Language Processing |
Corresponding author | Zhou Y(周玉) |
Author affiliations | 1. National Laboratory of Pattern Recognition, Institute of Automation, CAS; 2. University of Chinese Academy of Sciences; 3. CAS Center for Excellence in Brain Science and Intelligence Technology |
Recommended citation (GB/T 7714) | Zhu JN, Wang Q, Wang YN, et al. NCLS: Neural Cross-Lingual Summarization[C]. In: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). Hong Kong, China, 2019-11-03 to 2019-11-07.
Ingest method: OAI harvesting
Source: Institute of Automation