CN-AutoMIC: Distilling Chinese Commonsense Knowledge from Pretrained Language Models
Document Type: Conference Paper
Authors | Wang, Chenhao (1,2), et al.
Publication Date | 2022
Conference Date | 2022-12
Conference Location | Abu Dhabi, United Arab Emirates
Abstract | Commonsense knowledge graphs (CKGs) are increasingly applied in various natural language processing tasks. However, most existing CKGs are limited to English, which hinders related research in non-English languages. Meanwhile, directly generating commonsense knowledge from pretrained language models has recently received attention, yet it has not been explored in non-English languages. In this paper, we propose a large-scale Chinese CKG generated from multilingual pretrained language models (PLMs), named **CN-AutoMIC**, aiming to fill the research gap of non-English CKGs. To improve efficiency, we propose a generate-by-category strategy to reduce invalid generation. To ensure quality, we develop cascaded filters to discard low-quality results. To further increase diversity and density, we introduce a bootstrapping iteration process that reuses generated results. Finally, we conduct detailed analyses of CN-AutoMIC from different aspects. Empirical results show the proposed CKG has high quality and diversity, surpassing directly translated versions of similar English CKGs. We also find some interesting deficiency patterns and differences between relations, which reveal open problems in commonsense knowledge generation. We share the resources and related models for further study.
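The abstract outlines a three-stage construction pipeline: generate-by-category prompting, cascaded filtering, and a bootstrapping iteration that feeds accepted results back into generation. The following Python sketch is a minimal illustration of how those stages could compose, assuming the caller supplies a PLM wrapper and filter functions; every name, prompt template, and parameter below is a hypothetical assumption, not the authors' released code.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# (1) generate-by-category prompting, (2) cascaded filtering,
# (3) bootstrapping iteration that reuses accepted tails as new heads.
from typing import Callable, Iterable

def generate_by_category(head: str, relation: str,
                         generate: Callable[[str], list[str]]) -> list[str]:
    """Prompt the PLM with a relation-specific template so each call only
    produces tails for one relation (category), reducing invalid output."""
    prompt = f"{head} [{relation}]"  # assumed template format
    return generate(prompt)

def cascaded_filters(triples: Iterable[tuple[str, str, str]],
                     filters: list[Callable[[tuple[str, str, str]], bool]]):
    """Apply filters in sequence; a triple must pass every stage to survive."""
    for t in triples:
        if all(f(t) for f in filters):
            yield t

def bootstrap(seed_heads, relations, generate, filters, iterations=2):
    """Reuse generated tails as new heads to grow diversity and density."""
    kg, frontier = set(), list(seed_heads)
    for _ in range(iterations):
        candidates = [(h, r, t) for h in frontier for r in relations
                      for t in generate_by_category(h, r, generate)]
        accepted = list(cascaded_filters(candidates, filters))
        kg.update(accepted)
        frontier = [t for (_, _, t) in accepted]  # feed tails back in
    return kg

# Toy usage: a dummy generator and a trivial non-empty-tail filter.
dummy = lambda prompt: [prompt + " -> tail"]
kg = bootstrap(["吃饭"], ["xWant"], dummy, [lambda t: len(t[2]) > 0])
```

In this sketch, `generate` would wrap a multilingual PLM's conditional generation, and `filters` stand in for the cascaded quality checks the paper describes.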
Source URL | http://ir.ia.ac.cn/handle/173211/56699
Collection | 复杂系统认知与决策实验室
Author Affiliations | 1. School of Artificial Intelligence, University of Chinese Academy of Sciences; 2. National Laboratory of Pattern Recognition, Institute of Automation; 3. Beijing Academy of Artificial Intelligence
Recommended Citation (GB/T 7714) | Wang, Chenhao, Li, Jiachun, Chen, Yubo, et al. CN-AutoMIC: Distilling Chinese Commonsense Knowledge from Pretrained Language Models[C]. Abu Dhabi, United Arab Emirates, 2022-12.
Deposit Method: OAI Harvesting
Source: Institute of Automation