Query2Triple: Unified Query Encoding for Answering Diverse Complex Queries over Knowledge Graphs
Document type: Conference paper
Authors | Yao Xu1,3 |
Publication date | 2023-11-06 |
Conference dates | 2023.11.06-2023.11.10
Conference location | Singapore
Abstract | Complex Query Answering (CQA) is a challenging task over Knowledge Graphs (KGs). Due to the incompleteness of KGs, query embedding (QE) methods have been proposed to encode queries and entities into the same embedding space and to treat logical operators as neural set operators to obtain answers. However, these methods train KG embeddings and neural set operators concurrently on both simple (one-hop) and complex (multi-hop and logical) queries, which causes performance degradation on simple queries and low training efficiency. In this paper, we propose Query to Triple (Q2T), a novel approach that decouples the training for simple and complex queries. Q2T divides the training into two stages: (1) pre-training a neural link predictor on simple queries to predict tail entities from the head entity and relation; (2) training a query encoder on complex queries to encode diverse complex queries into a unified triple form that can be efficiently solved by the pretrained link predictor. Our proposed Q2T is not only efficient to train but also modular, and thus easily adaptable to various well-studied neural link predictors. Extensive experiments demonstrate that, even without explicit modeling of neural set operators, Q2T still achieves state-of-the-art performance on diverse complex queries over three public benchmarks. |
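The abstract's two-stage recipe (pre-train a neural link predictor on one-hop queries, then train only a query encoder that rewrites any complex query into a triple-like form scored by the frozen predictor) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a DistMult-style predictor, a small Transformer query encoder, and hypothetical names (`LinkPredictor`, `QueryEncoder`, `stage2_loss`).

```python
# Minimal sketch of Q2T's two-stage idea as described in the abstract.
# Assumptions (not from the paper): DistMult-style scoring, a toy Transformer
# query encoder, and hypothetical tokenization of complex queries.
import torch
import torch.nn as nn

class LinkPredictor(nn.Module):
    """Stage 1: neural link predictor trained on simple (head, relation, ?) queries."""
    def __init__(self, num_entities, num_relations, dim):
        super().__init__()
        self.ent = nn.Embedding(num_entities, dim)
        self.rel = nn.Embedding(num_relations, dim)

    def score(self, h_emb, r_emb):
        # DistMult-style scoring against all candidate tail entities: (h * r) . E^T
        return (h_emb * r_emb) @ self.ent.weight.t()

    def forward(self, heads, rels):
        return self.score(self.ent(heads), self.rel(rels))

class QueryEncoder(nn.Module):
    """Stage 2: encodes a tokenized complex query into a unified triple form,
    i.e. a pseudo head embedding and a pseudo relation embedding."""
    def __init__(self, vocab_size, dim, nhead=4, nlayers=2):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(dim, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, nlayers)
        self.to_head = nn.Linear(dim, dim)
        self.to_rel = nn.Linear(dim, dim)

    def forward(self, query_tokens):
        h = self.encoder(self.tok(query_tokens)).mean(dim=1)  # pooled query state
        return self.to_head(h), self.to_rel(h)

def stage2_loss(predictor, encoder, query_tokens, answer_ids):
    # Stage 2: the pretrained predictor is frozen; only the encoder is trained.
    pseudo_h, pseudo_r = encoder(query_tokens)
    logits = predictor.score(pseudo_h, pseudo_r)  # reuse the pretrained scorer
    return nn.functional.cross_entropy(logits, answer_ids)
```

Because the link predictor is kept frozen in stage 2 and only the query encoder is updated, simple-query and complex-query training are decoupled, which is the separation the abstract describes.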
Source URL | [http://ir.ia.ac.cn/handle/173211/57449]
Collection | Laboratory of Cognition and Decision Intelligence for Complex Systems
Corresponding author | Shizhu HE
Affiliations | 1. Institute of Automation, Chinese Academy of Sciences; 2. Meituan; 3. School of Artificial Intelligence, University of Chinese Academy of Sciences
Recommended citation (GB/T 7714) | Yao Xu, Shizhu HE, Cunguang Wang, et al. Query2Triple: Unified Query Encoding for Answering Diverse Complex Queries over Knowledge Graphs[C]. Singapore, 2023.11.06-2023.11.10. |
Ingestion method: OAI harvesting
Source: Institute of Automation