Chinese Academy of Sciences Institutional Repositories Grid
MGCoT: Multi-Grained Contextual Transformer for table-based text generation

Document type: Journal article

Authors: Mo, Xianjie1,2,3; Xiang, Yang2; Pan, Youcheng2; Hou, Yongshuai2; Luo, Ping1,2,3
Journal: EXPERT SYSTEMS WITH APPLICATIONS
Publication date: 2024-09-15
Volume: 250; Pages: 10
Keywords: Multi-grained contexts; Transformer; Abstractive table question answering; Table-to-text generation
ISSN: 0957-4174
DOI: 10.1016/j.eswa.2024.123742
Abstract: Recent advances in Transformer have led to the revolution of table-based text generation. However, most existing Transformer-based architectures ignore the rich contexts among input tokens distributed in multilevel units (e.g., cell, row, or column), leading to sometimes unfaithful text generation that fails to establish accurate association relationships and misses vital information. In this paper, we propose the Multi-Grained Contextual Transformer (MGCoT), a novel architecture that fully capitalizes on the multi-grained contexts among input tokens and thus strengthens the capacity of table-based text generation. The key primitive, the Multi-Grained Contexts (MGCo) module, involves two components: a local context sub-module that adaptively gathers neighboring tokens to form the token-wise local context features, and a global context sub-module that consistently aggregates tokens from a broader range to form the shared global context feature. The former aims at modeling the short-range dependencies that reflect the salience of tokens within similar fine-grained units (e.g., cell and row) attending to the query token, while the latter aims at capturing the long-range dependencies that reflect the significance of each token within similar coarse-grained units (e.g., multiple rows or columns). Based on the fused multi-grained contexts, MGCoT can flexibly and holistically model the content of a table across multi-level structures. On three benchmark datasets, ToTTo, FeTaQA, and Tablesum, MGCoT outperforms strong baselines by a large margin on the quality of the generated texts, demonstrating the effectiveness of multi-grained context modeling. Our source codes are available at https://github.com/Cedric-Mo/MGCoT.
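The two-branch design described in the abstract (a token-wise local context plus a shared global context, fused into each token) can be sketched in a few lines of NumPy. This is only an illustrative stand-in based on the abstract, not the paper's implementation: the real sub-modules are learned attention mechanisms, whereas here a fixed sliding-window average plays the role of the local branch and a norm-scored softmax pooling plays the role of the global branch. The function name `mgco_sketch` and both scoring choices are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mgco_sketch(tokens, window=1):
    """Illustrative multi-grained context fusion (NOT the paper's code).

    tokens: (n, d) array of token features.
    Local branch: mean over a +/-window neighborhood of each token,
    a fixed stand-in for the adaptive local context sub-module.
    Global branch: one softmax-weighted pooled feature shared by all
    tokens, a stand-in for the global context sub-module.
    """
    n, _ = tokens.shape
    # Local context: average the neighbors of each token.
    local = np.stack([
        tokens[max(0, i - window): i + window + 1].mean(axis=0)
        for i in range(n)
    ])
    # Global context: in the model the weights would be learned;
    # here the token norm serves as a proxy importance score.
    weights = softmax(np.linalg.norm(tokens, axis=1))
    global_ctx = (weights[:, None] * tokens).sum(axis=0)  # shape (d,)
    # Fuse: each token is enriched with its local context and the
    # single shared global context (broadcast over all tokens).
    return tokens + local + global_ctx

x = np.random.default_rng(0).normal(size=(5, 4))
out = mgco_sketch(x)  # same shape as the input: (5, 4)
```

The point of the sketch is the shape discipline: the local context is per-token (`(n, d)`), the global context is a single shared vector (`(d,)`), and fusion broadcasts the latter across all tokens, matching the abstract's "token-wise local context features" versus "shared global context feature" distinction.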
Funding: Major National Science and Technology Project [2022ZD0115305]; Major Key Project of PCL [PCL2022D01]; Major Key Project of PCL [PCL2023A09]; National Natural Science Foundation of China [62106115]; China Postdoctoral Science Foundation [2023M741843]
WOS research areas: Computer Science; Engineering; Operations Research & Management Science
Language: English
WOS accession number: WOS:001224645700001
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Source URL: [http://119.78.100.204/handle/2XEOYT63/38972]
Collection: Institute of Computing Technology, Chinese Academy of Sciences, Journal papers (English)
Corresponding author: Xiang, Yang
Author affiliations:
1. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc Chinese Acad Sci, Beijing, Peoples R China
2. Peng Cheng Lab, Shenzhen, Peoples R China
3. Univ Chinese Acad Sci, Beijing, Peoples R China
Recommended citation formats:
GB/T 7714
Mo, Xianjie, Xiang, Yang, Pan, Youcheng, et al. MGCoT: Multi-Grained Contextual Transformer for table-based text generation[J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250: 10.
APA: Mo, Xianjie, Xiang, Yang, Pan, Youcheng, Hou, Yongshuai, & Luo, Ping. (2024). MGCoT: Multi-Grained Contextual Transformer for table-based text generation. EXPERT SYSTEMS WITH APPLICATIONS, 250, 10.
MLA: Mo, Xianjie, et al. "MGCoT: Multi-Grained Contextual Transformer for table-based text generation." EXPERT SYSTEMS WITH APPLICATIONS 250 (2024): 10.

Ingestion method: OAI harvesting

Source: Institute of Computing Technology


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.