Chinese Academy of Sciences Institutional Repositories Grid
Enhancing text generation from knowledge graphs with cross-structure attention distillation

Document Type: Journal Article

Authors: Shi, Xiayang [1]; Xia, Zhenlin [1]; Cheng, Pei [1]; Li, Yinlin [2]
Journal: ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE
Publication Date: 2024-10-01
Volume: 136; Pages: 11
Keywords: KG-to-text; Pre-trained models; Knowledge distillation
ISSN: 0952-1976
DOI: 10.1016/j.engappai.2024.108971
Corresponding Author: Shi, Xiayang (aryang123@163.com)
Abstract: Existing large-scale pre-trained language models (PLMs) can effectively enhance knowledge-graph-to-text (KG-to-text) generation by processing a linearized version of the graph. However, recent work ignores the interaction between the linear representation of the knowledge graph and the generated text. To address this problem, we propose a distillation model that uses a cross-structure attention mechanism to generate text from knowledge graphs. This mechanism produces rich contextual semantic representations between the linearized structured data and the text information in the teacher model. We then train a student model to mimic the teacher's behavior by matching its output logits through online distillation. Experimental results demonstrate that our distillation model outperforms many pre-trained natural language generation (NLG) models on various KG-to-text datasets.
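The abstract's two building blocks, graph linearization and logit-level distillation, can be illustrated with a short sketch. This is a minimal illustration under assumed details, not the paper's implementation: the <H>/<R>/<T> linearization tags, the PyTorch framework, and the temperature-scaled KL loss are common choices in KG-to-text distillation work, and the function names (linearize, distillation_loss) are hypothetical.

```python
# Minimal sketch of graph linearization and logit distillation (PyTorch).
# All names, tags, and hyperparameters here are illustrative assumptions.
import torch
import torch.nn.functional as F

def linearize(triples):
    """Flatten (head, relation, tail) triples into a token string,
    e.g. [("Alan Turing", "field", "computer science")] becomes
    "<H> Alan Turing <R> field <T> computer science"."""
    return " ".join(f"<H> {h} <R> {r} <T> {t}" for h, r, t in triples)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student
    vocabulary distributions; the student learns to match the teacher's
    output logits, as in standard logit-based knowledge distillation."""
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    student_logp = F.log_softmax(student_logits / temperature, dim=-1)
    # "batchmean" matches the mathematical definition of KL divergence;
    # the T^2 factor keeps gradients comparable across temperatures.
    return F.kl_div(student_logp, teacher_probs,
                    reduction="batchmean") * temperature ** 2

# Usage example: distill over a batch of 4 tokens with a 6-word vocabulary.
student = torch.randn(4, 6)
teacher = torch.randn(4, 6)
loss = distillation_loss(student, teacher)
```

In the paper's online-distillation setting, teacher and student are trained jointly, so a loss of this form would be combined with the usual generation (cross-entropy) objective at each step rather than applied to a frozen teacher.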
Funding Projects: Transport Technology Project of Henan Province [2021G5]; Foundation and Cutting-Edge Technologies Research Program of Henan Province [242102210035]; Foundation and Cutting-Edge Technologies Research Program of Henan Province [242102210100]
WOS Research Areas: Automation & Control Systems; Computer Science; Engineering
Language: English
WOS Record Number: WOS:001274655900001
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Funding Organizations: Transport Technology Project of Henan Province; Foundation and Cutting-Edge Technologies Research Program of Henan Province
Source URL: [http://ir.ia.ac.cn/handle/173211/59355]
Collection: Institute of Automation, State Key Laboratory of Management and Control for Complex Systems, Robot Application and Theory Group
作者单位1.Zhengzhou Univ Light Ind, Zhengzhou 45000, Henan, Peoples R China
2.Chinese Acad Sci, Inst Automat, State Key Lab Multimodal Artificial Intelligence S, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Shi, Xiayang, Xia, Zhenlin, Cheng, Pei, et al. Enhancing text generation from knowledge graphs with cross-structure attention distillation[J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 136: 11.
APA: Shi, Xiayang, Xia, Zhenlin, Cheng, Pei, & Li, Yinlin. (2024). Enhancing text generation from knowledge graphs with cross-structure attention distillation. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 136, 11.
MLA: Shi, Xiayang, et al. "Enhancing text generation from knowledge graphs with cross-structure attention distillation". ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE 136 (2024): 11.

Ingestion Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.